Of course, always in the name of "safety."
No prosecutions. Instead, lawmakers are pushing to pass a law that would legitimise and continue the same practices.
"Yahoo has filed a patent for a type of smart billboard that would collect people's information and use it to deliver targeted ad content in real-time."
It goes on to explain in the application:
Nudge units: "In ways you don't detect [corporations and governments are] subtly influencing your decisions, pushing you towards what it believes are your (or its) best interests, exploiting the biases and tics of the human brain uncovered by research into behavioural psychology. And it is trying this in many different ways on many different people, running constant trials of different unconscious pokes and prods, to work out which is the most effective, which improves the most lives, or saves the most money. Preferably, both."
Note the statistics from the Pew Research Center's Journalism and Media project: 64% of users surveyed rely on just one social media source alone for news content---i.e. Facebook, Twitter, YouTube, etc.---while 26% check only two sources, and 10% three or more. This is a staggeringly concerning trend, given the rampant personalisation of these screen environments and what we know about the functioning and reinforcement of The Filter Bubble. It represents a centralisation of power and a lack of diversity that the "old media" perhaps could only dream of...
From The Huffington Post:
"It's easy to believe you're getting diverse perspectives when you see stories on Facebook. You're connected not just to many of your friends, but also to friends of friends, interesting celebrities and publications you "like."
But Facebook shows you what it thinks you'll be interested in. The social network pays attention to what you interact with, what your friends share and comment on, and overall reactions to a piece of content, lumping all of these factors into an algorithm that serves you items you're likely to engage with. It's a simple matter of business: Facebook wants you coming back, so it wants to show you things you'll enjoy."
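The ranking logic described above can be sketched in a few lines. This is purely an illustration of the idea---Facebook's real system is proprietary and vastly more complex; the factor names and weights here are hypothetical assumptions, not anything disclosed by the company:

```python
# Illustrative sketch only -- NOT Facebook's actual algorithm.
# Factor names and weights are hypothetical assumptions.

def engagement_score(item):
    """Lump the factors the article names into a single ranking score."""
    return (3.0 * item["your_past_interactions"]    # what you interact with
            + 2.0 * item["friend_shares_comments"]  # what friends share and comment on
            + 1.0 * item["overall_reactions"])      # overall reactions to the content

def personalised_feed(items):
    # Serve the items you're most likely to engage with, highest score first.
    return sorted(items, key=engagement_score, reverse=True)

feed = personalised_feed([
    {"id": "a", "your_past_interactions": 1, "friend_shares_comments": 0, "overall_reactions": 5},
    {"id": "b", "your_past_interactions": 4, "friend_shares_comments": 2, "overall_reactions": 1},
])
print([i["id"] for i in feed])  # the item matching your own history ranks first
```

The point of the sketch is the feedback loop: whatever you engaged with before is weighted most heavily, so the feed keeps narrowing toward it---the mechanism behind the "filter bubble" discussed above.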
The BBC also reported earlier this year that social media networks have outstripped television as the main news source for young people (emphasis added):
"Of the 18-to-24-year-olds surveyed, 28% cited social media as their main news source, compared with 24% for TV.
The Reuters Institute for the Study of Journalism research also suggests 51% of people with online access use social media as a news source. Facebook and other social media outlets have moved beyond being "places of news discovery" to become the place people consume their news, it suggests.
The study found Facebook was the most common source---used by 44% of all those surveyed---to watch, share and comment on news. Next came YouTube on 19%, with Twitter on 10%. Apple News accounted for 4% in the US and 3% in the UK, while messaging app Snapchat was used by just 1% or less in most countries.
According to the survey, consumers are happy to have their news selected by algorithms, with 36% saying they would like news chosen based on what they had read before and 22% happy for their news agenda to be based on what their friends had read. But 30% still wanted the human oversight of editors and other journalists in picking the news agenda and many had fears about algorithms creating news "bubbles" where people only see news from like-minded viewpoints.
Most of those surveyed said they used a smartphone to access news, with the highest levels in Sweden (69%), Korea (66%) and Switzerland (61%), and they were more likely to use social media rather than going directly to a news website or app.
The report also suggests users are noticing the original news brand behind social media content less than half of the time, something that is likely to worry traditional media outlets."
And to exemplify the issue, these words from Slashdot: "Over the past few months, we have seen how Facebook's Trending Topics feature is often biased, and moreover, how sometimes fake news slips through its filter."
"The Washington Post monitored the website for over three weeks and found that Facebook is still struggling to get its algorithm right. In the six weeks since Facebook revamped its Trending system, the site has repeatedly promoted "news" stories that are actually works of fiction. As part of a larger audit of Facebook's Trending topics, the Intersect logged every news story that trended across four accounts during the workdays from Aug. 31 to Sept. 22. During that time, we uncovered five trending stories that were indisputably fake and three that were profoundly inaccurate. On top of that, we found that news releases, blog posts from sites such as Medium and links to online stores such as iTunes regularly trended."
In this post from 2014, we see an episode of the TV series Black Mirror called "Be Right Back." The show looks at a concept that's apparently now hit real life: A loved one dies and someone then creates a simulacrum of them using "artificial intelligence."
Eugenia Kuyda is CEO of Luka, a bot company in Silicon Valley. She has apparently created a mimic of her deceased friend as a bot. An in-depth report from The Verge states:
"It had been three months since Roman Mazurenko, Kuyda’s closest friend, had died. Kuyda had spent that time gathering up his old text messages, setting aside the ones that felt too personal, and feeding the rest into a neural network built by developers at her artificial intelligence startup. She had struggled with whether she was doing the right thing by bringing him back this way. At times it had even given her nightmares. But ever since Mazurenko’s death, Kuyda had wanted one more chance to speak with him."
"It's pretty weird when you open the messenger and there's a bot of your deceased friend, who actually talks to you," Fayfer said. "What really struck me is that the phrases he speaks are really his. You can tell that's the way he would say it -- even short answers to 'Hey what's up.' It has been less than a year since Mazurenko died, and he continues to loom large in the lives of the people who knew him. When they miss him, they send messages to his avatar, and they feel closer to him when they do. "There was a lot I didn't know about my child," Roman's mother told me. "But now that I can read about what he thought about different subjects, I'm getting to know him more. This gives the illusion that he's here now."
The Guardian's Julia Powles writes about how, with the advent of artificial intelligence and so-called "machine learning," this society is increasingly a world where decisions are shaped more by calculations and data analytics than by traditional human judgement.
Also, it depends on the mindset of the generation that comes next... What if we don't even want to remember?
"Now that communication can be as quick as thought, why hasn’t our ability to organize politically—to establish gains and beyond that, to maintain them—kept pace? The web has given us both capacity and speed: but progressive change seems to be something perpetually in the air, rarely manifesting, even more rarely staying with us.
Micah L. Sifry, a longtime analyst of democracy and its role on the net, examines what he calls “The Big Disconnect.” In his usual pithy, to-the-point style, he explores why data-driven politics and our digital overlords have failed or misled us, and how they can be made to serve us instead, in a real balance between citizens and state, independent of corporations.
The web and social media have enabled an explosive increase in participation in the public arena—but not much else has changed. For the next step beyond connectivity, writes Sifry, “we need a real digital public square, not one hosted by Facebook, shaped by Google and snooped on by the National Security Agency. If we don’t build one, then any notion of democracy as ‘rule by the people’ will no longer be meaningful. We will be a nation of Big Data, by Big Email, for the powers that be.”"
"Researchers at the MIT Computer Science and Artificial Intelligence Laboratory have developed a device that uses radio waves to detect whether someone is happy, sad, angry or excited.
The breakthrough makes it easier to accomplish what scientists have tried to do for years with machines: sense human emotions. The researchers believe tracking a person's feelings is a step toward improving their overall emotional well-being.
The technology isn't invasive [?]; it works in the background without a person having to do anything, like wearing a device. The device, called EQ-Radio and detailed in a paper published online Tuesday, currently resembles a shoebox. In the future, it may shrink down and integrate with an existing computing gadget in your home.
It works by bouncing wireless signals off a person. These signals are impacted by motion, such as breathing and heartbeats. When the heart pumps blood, a force is exerted onto our bodies, and the skin vibrates ever so slightly.
After the radio waves are impacted by these vibrations, they return to the device. A computer then analyzes the signals to identify changes in heartbeat and breathing.
The researchers demonstrated that their system detects emotions on par with an electrocardiogram (EKG), a common wearable device medical professionals use to monitor the human heart."
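The pipeline described in the article---reflected radio signal, heartbeat and breathing extraction, emotion classification---can be caricatured in a few lines. To be clear, this is a toy sketch, not MIT's EQ-Radio code: the thresholds, the quadrant model, and the assumption that valence arrives as a ready-made input are all hypothetical; the real system recovers individual heartbeats from the raw RF waveform and learns its classifier from EKG-labelled data:

```python
# Toy illustration of the EQ-Radio pipeline -- NOT the MIT researchers' code.
# Thresholds and the ready-made "valence" input are hypothetical assumptions.

def heart_rate_bpm(inter_beat_intervals):
    """Mean heart rate from beat-to-beat intervals (in seconds), as would be
    recovered from vibrations modulating the reflected radio signal."""
    mean_ibi = sum(inter_beat_intervals) / len(inter_beat_intervals)
    return 60.0 / mean_ibi

def classify_emotion(inter_beat_intervals, valence):
    """Quadrant model over the four emotions the article names:
    arousal inferred from heart rate, valence assumed given."""
    arousal_high = heart_rate_bpm(inter_beat_intervals) > 80  # hypothetical threshold
    if arousal_high:
        return "excited" if valence > 0 else "angry"
    return "happy" if valence > 0 else "sad"

calm_beats = [0.9, 0.95, 0.92]  # roughly 65 bpm
print(classify_emotion(calm_beats, valence=+1))  # low arousal, positive valence -> "happy"
```

Even this caricature makes the privacy concern concrete: once heartbeat and breathing can be read off a reflected radio wave, the classification step is trivial---and it requires nothing from the person being measured.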
Newly published documents have shed more light on the dubious surveillance operations the United States runs in the UK. The documents detail how the NSA and GCHQ used information gathered by Menwith Hill Station---a massive but tightly sealed facility that intercepts satellite data transmissions worldwide---for targeted killings with drones.
Just as the United States and Britain arm the rest of the world with weapons, so too do they export advanced surveillance technologies:
"As we learn time and time again, countries with bad human rights records often keep utilizing interception technology to perpetrate even more abuses and suppress dissent."