Bimonthly Review of Scottish Politics and Affairs
A man poses in front of Dublin’s famous Temple Bar, pint in hand, smiling at the camera. Posted to Instagram, the image is a snapshot in time, just one of the millions uploaded to the site each day.
So why does this image matter so much?
It appears in an art project by Belgian artist Dries Depoorter called The Follower. Well known for creating works about surveillance, privacy and machine learning, Depoorter has made use of open surveillance cameras that are accessible from any device connected to the internet.
For The Follower, Depoorter recorded free surveillance feeds at well-known tourist sites around the world, available through EarthCam.com, over a two-week period. He then gathered all the geotagged Instagram posts from each location covering the same fortnight, and used image-recognition software to compare the Instagram photos with the footage and identify the moment each photo was taken.
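Depoorter has not published his exact pipeline, but the matching step can be illustrated with a toy sketch: reduce each image to a perceptual “average hash” and find the recorded webcam frame closest to a posted photo. Everything here — the tiny pixel grids, the distance threshold — is invented for illustration; the real project reportedly ran recognition software over full video frames.

```python
# Toy illustration of image matching, as in a project like The Follower.
# NOT Depoorter's actual method: real systems would use face recognition
# or learned embeddings, not a hand-rolled average hash on tiny grids.

def average_hash(pixels):
    """Hash a tiny grayscale image (rows of 0-255 ints) to a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

def best_match(photo, frames, max_distance=3):
    """Return (timestamp, distance) of the recorded frame closest to the photo."""
    target = average_hash(photo)
    best = min(((ts, hamming(target, average_hash(f))) for ts, f in frames),
               key=lambda t: t[1])
    return best if best[1] <= max_distance else None

# Two toy 2x4 "frames" captured from a webcam feed, keyed by timestamp.
frames = [
    ("12:00:01", [[10, 10, 200, 200], [10, 10, 200, 200]]),
    ("12:00:02", [[200, 200, 10, 10], [200, 200, 10, 10]]),
]
# An "Instagram photo" nearly identical to the second frame.
photo = [[198, 201, 12, 9], [202, 199, 11, 10]]
print(best_match(photo, frames))  # -> ('12:00:02', 0)
```

A perceptual hash is used here rather than byte-for-byte comparison because a posted photo never matches the stream exactly: compression, cropping and exposure all shift pixel values, but the overall light/dark pattern survives.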
At first glance, this seems trivial. Funny, even. Depoorter was able to locate the moments before and after people created their social media posts. Is that so bad? Maybe not: it is just one man’s art project, using data sets that are readily available to everyone.
But it raises a question: what could a well-organised, well-funded team running AI software inside a hostile government or organisation do? It turns out that Depoorter has shown us only the tip of the iceberg.
It is clear that something like The Follower could be carried out on a far larger scale by an unscrupulous government or organisation. Fabien Benetou, a member of the European Parliament’s Innovation Lab, believes there is a good chance that AI-assisted surveillance capability is already far more advanced than we think.
“To assume that corporations and surveillance agencies, with teams dedicated to scalability and computer-vision experts engaged for years, have done nothing with this, especially after the examples shared by Snowden and others, is unfortunately a bit naïve,” he says.
Albert King, chief data officer at NHS National Services Scotland and former chief data officer for the Scottish government, says The Follower is “voyeuristic” because it “makes people vulnerable”. But, he says, it is pointed enough to get the message across.
“I’m uneasy about Depoorter’s approach as it raises some of those issues. However, I think it’s a good wake-up call for us, individually and collectively, and it’s a good illustration of the kinds of things that are possible.”
Alistair Duff, emeritus professor of information policy at Edinburgh Napier University, says Depoorter’s project shows that using artificial intelligence software to collect and interpret data posted online by members of the public can invade their privacy in an indirect way, because they do not understand the potential consequences.
“People consented to it in a sense, but they probably wouldn’t have consented to the future consequences, especially with the combination of material.”
Duff says digitisation is why people are losing ever more of their privacy. Videos posted on social media can be “a real problem” because “there’s a permanent record”. Before, members of the public enjoyed a “practical obscurity”; now, recordings and messages persist, in one form or another, online.
As the ways in which we can connect data with the help of AI become more sophisticated, individuals’ movements will be revealed in ever more detail. King describes the advent of AI as a “sea change” and “a matter of scale”.
“We put all this data about ourselves in the public domain, and with AI, things like image recognition can identify individuals from social media posts, whether you posted them yourself or not,” he says.
“Using AI to connect those data points creates a network of linked data that can be analysed. From that, you can quickly extract more meaningful information and start to build a detailed picture of a person: who you are, who you interact with and what you do.
“But the important thing to remember is that while AI creates risks, which certainly need to be addressed, it is also a technology that can bring benefits, help save lives, improve services and grow our economy. The key is to work together to address those risks.”
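King’s “network of connected data” can be made concrete with a small, entirely hypothetical sketch: records scraped from different sources are merged into one profile whenever they share any identifier — an email address, a handle, a matched face ID. This is not any real product’s method, just the record-linkage idea in miniature.

```python
# Hypothetical sketch of record linkage: merge records from different
# sources into one profile whenever they share any identifier value.
# In practice the "identifiers" might come from face matches or geotags.

def link_profiles(records):
    """Group records (dicts of field -> identifier value) that share any value."""
    owner = {}      # identifier value -> index of the profile that holds it
    profiles = []   # each profile is a list of records
    for rec in records:
        hits = sorted({owner[v] for v in rec.values() if v in owner})
        if hits:
            target = hits[0]
            # this record may bridge several profiles; fold them together
            for i in hits[1:]:
                for moved in profiles[i]:
                    for v in moved.values():
                        owner[v] = target
                profiles[target].extend(profiles[i])
                profiles[i] = []
            profiles[target].append(rec)
        else:
            target = len(profiles)
            profiles.append([rec])
        for v in rec.values():
            owner[v] = target
    return [p for p in profiles if p]

records = [
    {"name": "anna", "email": "a@example.com"},       # from one site
    {"handle": "@anna_k", "email": "a@example.com"},  # from another
    {"handle": "@anna_k", "city": "dublin"},          # from a third
    {"name": "someone_else"},
]
print(len(link_profiles(records)))  # -> 2 profiles: anna's three records, plus one other
```

The point of the sketch is the chaining: the third record shares nothing with the first, yet ends up in the same profile via the second — exactly how separately harmless posts accumulate into a detailed picture.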
Speaking of the permanent record users leave on social media, Duff believes: “Public privacy is pretty dead in the water.” Some kind of amnesty, he argues, will have to follow.

“People have been defenestrated and marginalised and worse. I think there has to be tolerance, because a lot of these personal things wouldn’t have fallen into the public domain [before social media]. I think it’s very illiberal and frankly self-righteous to dig up this kind of thing.”
While there are concerns about privacy, there are also clear potential benefits to people if AI is applied in the right way, King says. But it has to be done in a way that is “transparent and protects people’s rights and privacy”, while ensuring that we understand the limitations of the data and do not overreach.
He highlights the work done at the beginning of the pandemic with Google’s mobility data on how the virus affected the movement of people around the world. Linking this to economic activity provided “an important picture and perspective of the pandemic”.
The Scottish Government used people’s data to help manage the pandemic when it launched NHS Scotland’s Protect Scotland test-and-trace app. The app used Bluetooth to measure contact between users and, combined with data entered by users, helped track the spread of Covid-19. Scotland also showed that it could build effective software efficiently: the app cost the Scottish government just £300,000, while the equivalent English app cost around £36 million.
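The proximity idea behind an app like Protect Scotland can be sketched in a few lines. The real app built on the Apple/Google Exposure Notification framework, which works with Bluetooth signal attenuation and rotating anonymous identifiers; the signal threshold and 15-minute figure below are illustrative assumptions, not the app’s actual parameters.

```python
# Illustrative sketch only -- NOT the Protect Scotland implementation.
# Assume one Bluetooth RSSI reading per minute for a nearby phone's
# rotating identifier; a "close contact" is enough minutes of strong signal.

def is_close_contact(rssi_readings, threshold=-65, min_minutes=15):
    """True if readings at or above the threshold add up to >= min_minutes."""
    close_minutes = sum(1 for rssi in rssi_readings if rssi >= threshold)
    return close_minutes >= min_minutes

# 20 minutes of strong signal -> flagged; 5 strong minutes then distance -> not.
print(is_close_contact([-60] * 20))             # -> True
print(is_close_contact([-60] * 5 + [-85] * 30)) # -> False
```

Note the privacy-preserving design choice this mirrors: the decision uses only signal strength over time between anonymous identifiers, with no location data involved.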
But that’s not the whole picture. While Scotland has done well with public data-gathering projects, a recent freedom of information request by the Scottish Liberal Democrats revealed that at least 13 Scottish councils currently operate surveillance cameras manufactured by Hikvision, a company allegedly linked to the repression of the Uighurs. In China, Hikvision’s facial-recognition cameras have reportedly been used to single out entire populations of ethnic minorities, putting Tibetans and Uighurs at serious risk.
The cameras continued to be installed even after a report by the British parliament’s foreign affairs committee connected Hikvision to human rights abuses and concluded that it should not be winning government contracts. Hikvision says it respects human rights.
Scottish Liberal Democrat leader Alex Cole-Hamilton said: “It is incredibly disappointing that not only are at least a dozen councils using these cameras, but they have continued to install them even after the foreign affairs committee connected Hikvision to human rights abuses.
“There have been warnings that Hikvision supplies surveillance equipment to the Chinese government. The Scottish and UK governments need to get off the fence and introduce stricter rules on partnerships with Chinese companies.”
Looking at other countries, such as China, that are more advanced in embedding AI and other technologies into everyday life, it is clear that their citizens have lost aspects of their privacy.
Duff says: “I interviewed a great anthropologist in Silicon Valley a few years ago and asked, don’t you think it’s terrible how privacy is disappearing?
“He said, ‘You have a very parochial view of culture. The Chinese are happy with it; they are more communal.’”
China has by far the most surveilled population on the planet, and facial recognition is used in a multitude of settings, including rubbish collection and the dispensing of toilet paper. But it seems that attitudes towards privacy are changing in China.
In 2021, a Beijing think tank asked 1,515 anonymous Chinese citizens whether they thought the technology should be used in commercial areas, and nearly 90 per cent did not want it. Some 70 per cent felt it should not be used in residential areas either.
King criticises the way China has used AI in recent years: “They have no qualms about using these technologies in a way that, frankly, we wouldn’t accept in the UK and Europe.
“Part of this is about protecting people’s rights and privacy, but I’m also concerned that the Chinese state is overstating what the technology can do. There are concerns about bias in algorithms and about the failure rates of image algorithms.
“There are plenty of stories coming out of China of facial-recognition algorithms simply making bad decisions. These uses of the technology not only threaten people’s rights, they are fundamentally inaccurate and flawed.
“I think we’re very lucky to be in a country that takes a much more considered approach to adopting these technologies.”
The question of government-level regulation to protect individuals’ right to privacy is an obvious one. Duff says “the panopticon [surveillance state] is inevitable”, but that it will have to be accompanied by “good data policy and better regulation, without a nanny state”. He doesn’t believe there can be one “great privacy law”; rather, he believes people should be consulted before looking at “specific areas and seeing where regulation might be necessary”.
“But even with all this, there’s no doubt that the exposure of the individual will increase, and we have to come to terms with that, because this is the modern world.”
Duff believes it can be argued that “the panopticon is about a new form of communal living, where we embrace being more connected to each other”.
“Maybe privacy will disappear. There wasn’t much of it in medieval times or in the ancient world. Privacy was an invention of the Victorian era, which is when people started having separate bedrooms and achieving a degree of anonymity.
“Maybe privacy is just a blip, and we’re going back to the global village. Many people on the left would be in favour of a society that is more communitarian than individualistic.”
But King disagrees. “I think privacy is about personal choice,” he says. “How you choose to interact with technology, what data you choose to put in the public domain, and to what extent you keep it to yourself.
“I think we, as individuals, have a lot of agency, and we’re going to continue to have a lot of agency over that. No doubt regulation will have an effect on the contexts in which we exercise it.

“A lot of this will come down to personal choice.”
Holyrood provides comprehensive coverage of Scottish politics, with award-winning reporting and analysis.