
Where different views on Israel and Judaism are welcome.


Tag: privacy

Our rights in the age of AI

Dr. Rumman Chowdhury, chief executive officer and founder of Parity, gave the keynote address at the Simces & Rabkin Family Dialogue on Human Rights. (photo from rummanchowdhury.com)

Data and social scientist Dr. Rumman Chowdhury provided a wide-ranging analysis on the state of artificial intelligence and the implications it has on human rights in a Nov. 19 talk. The virtual event was organized by the Canadian Museum for Human Rights in Winnipeg and Vancouver’s Zena Simces and Dr. Simon Rabkin for the second annual Simces & Rabkin Family Dialogue on Human Rights.

“We still need human beings thinking even if AI systems – no matter how sophisticated they are – are telling us things and giving us input,” said Chowdhury, who is the chief executive officer and founder of Parity, a company that strives to help businesses maintain high ethical standards in their use of AI.

A common misperception of AI is that it looks like futuristic humanoids or robots, like, for example, the ones in Björk’s 1999 video for her song “All is Full of Love.” But, said Chowdhury, artificial intelligence is instead computer code, algorithms or programming language – and it has limitations.

“Cars do not drive us. We drive cars. We should not look at AI as though we are not part of the discussion,” she said.

In her presentation Nov. 19 at the Simces & Rabkin Family Dialogue on Human Rights, Dr. Rumman Chowdhury highlighted the 2006 Montreal Declaration of Human Rights.

The 2006 Montreal Declaration of Human Rights has served as an important framework in the age of artificial intelligence. The central tenets of that declaration include well-being, respect for autonomy and democratic participation. Around those concepts, Chowdhury addressed human rights in the realms of health, education and privacy.

Pre-existing biases have permeated healthcare AI, she said, citing the example of a complicated algorithm from care provider Optum that prioritized less sick white patients over more sick African-American patients.

“Historically, doctors have ignored or downplayed symptoms in Black patients and given preferential treatment to white patients – this is literally in the data,” explained Chowdhury. “Taking that data and putting it into an algorithm simply trains it to repeat the same actions that are baked into the historical record.”
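Chowdhury’s point – that a model trained on biased records simply replays them – can be shown with a toy sketch. The data and the memorizing “model” below are entirely hypothetical illustrations, not the Optum system:

```python
# Illustrative sketch: a toy "model" that memorizes historical triage
# decisions reproduces whatever bias those decisions contain.
from collections import defaultdict

# Hypothetical historical records: (group, symptom severity, was prioritized).
# Identical symptoms were historically treated differently by group.
history = [
    ("A", "severe", True), ("A", "mild", True),
    ("B", "severe", False), ("B", "mild", False),
]

# "Training" here is just tallying past outcomes per (group, severity) pair.
votes = defaultdict(list)
for group, severity, prioritized in history:
    votes[(group, severity)].append(prioritized)

def predict(group, severity):
    past = votes[(group, severity)]
    return sum(past) > len(past) / 2  # majority of past decisions

# The model faithfully repeats the historical pattern: same severity,
# different outcome depending on group.
print(predict("A", "severe"))  # True
print(predict("B", "severe"))  # False
```

Nothing in the code “decides” to discriminate; the disparity comes entirely from the training records, which is exactly the mechanism Chowdhury describes.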

Other reports have shown that an algorithm used in one region kept Black patients from getting kidney transplants, leading to patient deaths, and that COVID-19 relief allocations based on AI were disproportionately underfunding minority communities.

“All algorithms have bias because there is no perfect way to predict the future. The problem occurs when the biases become systematic, when there is a pattern to them,” she said.

Chowdhury suggested that citizens have the right to know when algorithms are being used, so that the programs can be examined critically and beneficial outcomes to all people can be ensured, with potential harms being identified and corrected responsibly.

With respect to the increased use of technology in education, she asked, “Has AI ‘disrupted’ education or has it simply created a police state?” Here, too, she offered ample evidence of how technology has sometimes gone off course. For instance, she shared a news report from this spring from the United Kingdom, where an algorithm was used by the exam regulator Ofqual to determine the grades of students. For no apparent reason, the AI system downgraded the results of 40% of the students, mostly those in vulnerable economic situations.

Closer to home, a University of British Columbia professor, Ian Linkletter, was sued this year by the tech firm Proctorio for a series of tweets critical of its remote testing software, which the university was using. Linkletter shared his concerns that this kind of technology does not, in his mind, foster a love of learning in the way it monitors students and he called attention to the fact that a private company is collecting and storing data on individuals.

To keep the pernicious aspects of ed tech from damaging schooling, Chowdhury thinks some fundamental questions should be asked. Namely, what is the purpose of educational technology in terms of the well-being of the student? How are students’ rights protected? And how can the need to prevent cheating on exams be balanced with the rights of the majority of students?

“We are choosing technology that punishes rather than that which enables and nurtures,” she said.

Next came the issue of privacy, which, Chowdhury asserted, “is fascinating because we are seeing this happen in real-time. Increasingly, we have a blurred line between public and private.”

She distinguished between choices that a member of the public may have as a consumer in submitting personal data to a company like Amazon versus a government organization. While a person can decide not to purchase from a particular company, they cannot necessarily opt out of public services, which also gather personal information and use technology – and this is a “critical distinction.”

Chowdhury showed the audience a series of disturbing news stories from over the past couple of years. In 2018, the New Orleans Police Department, after years of denial, admitted to using AI that sifted through data from social media and criminal history to predict when a person would commit a crime. Another report came from the King’s Cross district of London, which has one of the highest concentrations of facial-recognition cameras of any region in the world outside of China, according to Chowdhury. The preponderance of surveillance technology in our daily lives, she warned, can bring about what has been deemed a “chilling effect,” or a reluctance to engage in legitimate protest or free speech, due to the fear of potential legal repercussions.

Then there are the types of surveillance used in workplaces. “More and more companies are introducing monitoring tech in order to ensure that their employees are not ‘cheating’ on the job,” she said. These technologies can intrude by secretly taking screenshots of a person’s computer while they are at work, and mapping the efficiency of employees through algorithms to determine who might need to be laid off.

“All this is happening at a time of a pandemic, when things are not normal. Instead of being treated as a useful contributor, these technologies make employees seem like they are the enemy,” said Chowdhury.

How do we enable the rights of both white- and blue-collar workers? she asked. How can we protect our right to peaceful and legitimate protest? How can AI be used in the future in a way that allows humans to reach their full potential?

In her closing remarks, Chowdhury asked, “What should AI learn from human rights?” She introduced the term “human centric” – “How can designers, developers and programmers appreciate the role of the human rights narrative in developing AI systems equitably?”

She concluded, “Human rights frameworks are the only ones that place humans first.”

Award-winning technology journalist and author Amber Mac moderated the lecture, which was opened by Angeliki Bogiatji, the interpretive program developer for the museum. Isha Khan, the museum’s new chief executive officer, welcomed viewers, while Simces gave opening remarks and Rabkin closed the broadcast.

Sam Margolis has written for the Globe and Mail, the National Post, UPI and MSNBC.

***

Note: This article has been corrected to reflect that it was technology journalist and author Amber Mac who moderated the lecture.

Posted on December 4, 2020 (updated December 7, 2020). Author: Sam Margolis. Categories: Local. Tags: AI, Canadian Museum for Human Rights, CMHR, dialogue, education, health, human rights, privacy, Rumman Chowdhury, Simon Rabkin, technology, Zena Simces.
The dangers of drones

An image of drones filling the sky from Reva Stone’s Falling. (photo from Reva Stone)

Multi-award-winning Winnipeg artist Reva Stone researched drones for three years and then began creating art to share some of what she had learned about how the technology affects our lives. The exhibit erasure, which comes from that research, features three works – Falling, Atomic Bomb and Erase. It is on display at the University of Manitoba’s School of Art Gallery until April 26.

“I’m very much an observer of what’s going on with new technologies, so when I saw the impact that UAVs [unmanned aerial vehicles] were starting to have – especially with war and changing the nature of war – I applied for and got a Canada Council [for the Arts] grant to do a lot of research and reading about what actually is happening,” Stone told the Independent.

She went so far as to acquire two quadcopters, both to understand what they really sounded like and in the hope of using them in her art, which she has since done.

“I was working on this, and then I started thinking about our skies filling up with these commercial and militarized drones and how they were basically machines … that could fall out of the sky … that could crash into each other, that could bring down an aircraft. We were filling up our skies,” she said. “And then, about two years ago, I was reading and realized that we were now targeting not other countries, but targeting humans.”

Artist Reva Stone’s exhibit erasure warns about the use of drones in our society. (photo from Reva Stone)

Stone ended up making five or six individual pieces that deal with different aspects of the use of drones, but relate to one another. Depending on the exhibition venue, she decides which ones will work best together in a particular space.

Originally, drones were developed for spying purposes for the military. Later versions were outfitted with weapons for protection and assault. More recently, commercial drones have been developed. Now, anyone can buy a drone for as little as $20. This easy accessibility is challenging our society, contends Stone, posing hazards to planes near airports, affecting people at parks and disrupting the peace.

“Drones are becoming these things that fly in the air that have no human controllers … that are almost autonomous,” she said.

Stone often uses computers, movies, motors and speakers to help fully immerse visitors in her art pieces.

The work Falling, she said, “is an animated video that I made that has to do with what I see as a very new future, wherein UAVs are ubiquitous, because of civilian, military, commercial and private use.

“It’s almost slow motion or balletic on a massive screen,” she said. “There’s constant falling out of the skies, sometimes flipping as they fall. Sometimes, there’s a drone that has exploded in the sky … sometimes, small and far away and, sometimes, they’re so big when they fall through the sky that they look almost life-size and you’ll have to back away from the screen … that will be the feeling you get. Then, sometimes, there are these little windows that open up and you look through, into another world, and that world is more about what we’re fighting about – the fact that we are actually using these to make war. Other than that, some of them are commercial, some are cute, some are scary looking … and it’s like a continuous rain coming down.”

Atomic Bomb is also a film.

“I started with an early atomic bomb explosion,” said Stone. “It was a 30-second film and I made it into an almost 20-minute video. I really slowed it down and altered the time to give the impression that the person in the exhibition space is looking at a still image caught in time. I show this video together with texts that I found speak to the history of the use of radio-controlled airplanes and UAVs, and to long-held ideas about collateral damage – the relationship between … the use of atomic bombs and the use of drones and collateral damage, which, to me, is a huge issue with the use of drones as military.”

A single frame from Reva Stone’s Atomic Bomb video piece. (photo from Reva Stone)

The first text is from Harry Truman, the American president who made the decision in the Second World War to use the bomb, and it reads: “The world will note that the first atomic bomb was dropped on Hiroshima, a military base. That was because we wished, in this first attack, to avoid, insofar as possible, the killing of civilians.”

The next one is from John Brennan, Central Intelligence Agency director from 2013 to 2017: “There hasn’t been a single collateral death because of the exceptional proficiency, precision of the capabilities we’ve been able to develop.”

According to Stone, “This is just bullshit. But this is part of the cleaning up of the media presentation of all these ideas and all these things I’ve been researching, that I’ve been noticing going on over time. And, it has actually made me change the name of the work. I was going to call all three of them a totally different name. Recently, maybe a month ago, I changed it to erasure because of the erasure of people, the erasure of a lot of critical dialogue that’s been happening since I started researching in 2015 … how we are mediated, what we are presented with as a culture. The info is so mediated by how it’s reported, and if it’s reported.”

Stone wants “her audience to consider how the capabilities of such technology may be turned against citizens and how governments might, and do, get away with employing them in the name of patriotism in ways that ultimately test the ethical and moral values of its citizenry,” notes the exhibit description. “With news cycles moving so rapidly, the reports of deadly events quickly fall from memory, seemingly erased from public consciousness.”

The third piece, Erase, is interactive. Stone said it is based on what, in her view, the Obama administration practised – the targeting of individuals based on algorithms, mostly guilt by association.

“With this one, I’m actually replicating the procedure,” she said. “I have my two quadcopters that are doing the surveillance and capturing people in the exhibition space, unbeknownst to them. Then, they get captured and saved.

“Then, it’s a process that goes on, that they get played back. And you begin to realize that you’re under surveillance, the people in the space. And, every so often, a target comes up over one of them, one of the captured images. It’s really intense and an explosion occurs, and that person actually comes out of my captured list. That person will never show again. They’ve been erased.”

The exhibit erasure opened Feb. 7. For more information about Stone and her work, visit revastone.ca.

Rebeca Kuropatwa is a Winnipeg freelance writer.

Posted on March 15, 2019 (updated March 14, 2019). Author: Rebeca Kuropatwa. Categories: Visual Arts. Tags: art, atomic bomb, cultural commentary, democracy, drones, military, privacy, Reva Stone, technology, war.

Genetic testing and privacy

Genetic testing can save lives. So, why isn’t everyone getting it done? It turns out that insurance companies are using the information from the tests to discriminate against applicants.

While this is by no means a Jewish-specific issue, the National Council of Jewish Women of Canada (NCJWC) and the Centre for Israel and Jewish Affairs (CIJA) are taking the lead in urging the federal government to legislate against this discriminatory practice.

“At NCJWC, our goals are for education, service and social action,” said Sharon Allentuck, the organization’s national president. “Social action includes writing to MPs, senators and the prime minister … [about] genetic testing and insurance denial.”

Genetic testing has been high on NCJWC’s list of priorities for the past 25 to 30 years and it continues to be – not just with respect to concerns over insurance companies’ actions, but also to increase public awareness of the importance of genetic testing.

In Winnipeg, for example, a clinic is held every three to four years in conjunction with Health Sciences Centre geneticist Dr. Cheryl Greenberg. While, in the past, the main focus was on Tay-Sachs, the list keeps getting larger, as geneticists like Greenberg discover new gene connections. At the moment, the list stands at seven to eight different Jewish genetic diseases being studied.

By getting a test done, one can be aware of a possible genetic problem that might affect oneself or one’s children, if a person has children with another carrier of the same disease. This knowledge can provide people with peace of mind when choosing a partner.

So far, though, this knowledge has come with a cost. When people apply for insurance, they are asked to disclose the results of their genetic testing.

“It came to our attention that insurance companies said to some people, ‘You’ve been tested, genetically. You have certain predispositions. Sorry, but we’re going to deny you insurance,’” said Allentuck. “It’s against human rights, it’s discriminatory. Canada is the only G7 country that allows this to happen. And so, legislation [Bill S-201] preventing that discrimination was passed through the Senate and now it’s in the House of Commons. We are asking our members and are working with CIJA to encourage [Jewish community] members to contact their members of Parliament to ensure the legislation passes.”

CIJA adds on its website, “We encourage provincial legislatures to pass complementary legislation, with a specific focus on employment and insurance.”

For more information, visit NCJWC’s website or Facebook page. Allentuck encouraged readers to become NCJWC Facebook friends in order to stay regularly updated on this and other important topics.

“This isn’t a Jewish issue,” she said. “But that doesn’t mean that Jewish people can’t have a say in it.”

Rebeca Kuropatwa is a Winnipeg freelance writer.

Posted on October 28, 2016 (updated October 27, 2016). Author: Rebeca Kuropatwa. Categories: National. Tags: Bill S-201, genetic testing, privacy.

Let’s talk about security, privacy

Last week, the federal government introduced proposed legislation intended to strengthen anti-terror powers of police, the intelligence service and the military.

The legislation would make it illegal to advocate or promote terrorism, would allow courts to remove terrorist propaganda from the internet, and make it easier for authorities to apprehend suspected terrorists before they act.

Civil libertarians waded in immediately. The British Columbia Civil Liberties Association, which is already engaged in litigation against the federal government over allegations of electronic surveillance without warrants, warned that the legislation would give new powers to security agencies that have “shamefully inadequate oversight and are hostile to accountability.”

The proposed legislation comes on the heels of two terror attacks in Canada last year by apparent lone wolves in Ottawa and Saint-Jean-sur-Richelieu, Que. In its press release announcing the measures, the government pronounced the world “a dangerous place” and reminded us that “Canada is not immune to the threat of terrorism.” Fair enough.

But Canada is also not immune from the threat of government overreach. There is a very fine line here, and a democracy needs to struggle to find precisely the right balance around these issues. While a terror attack can come out of the blue and kill, threats to individual liberties tend to emerge more slowly, and the harm they do is not as immediately clear.

Israel is probably the most illustrative example of a democratic society trying to balance individual rights with protection of civilians from determined terrorists.

The balance that Israel has struggled to find between the rule of law, protection of civilians and the preservation of core civil liberties has been one of the defining and divisive characteristics of Israeli life for decades.

Balancing the physical safety of civilians with the preservation of the freedoms that define that country invigorates a vibrant public discourse, an ongoing, hand-wringing, conscience-challenging debate that carries on with extraordinary passion in a vibrant political ferment.

Among the problems with applying the Israeli model to Canada is that, put simply, Canada is not Israel. Canada has had nothing even remotely comparable to the onslaught of terror attacks Israel has endured. Nothing should diminish the grief and determination we felt collectively after the two incidents last year in this country, but neither should we pretend that our society is under imminent threat of sustained, existential violence from ideological forces. That is simply not the case. Proponents of the legislation might say that we need to make sure that things do not get out of hand by getting ahead of them early. Perhaps. But a wiser solution still would be to work with and support communities where radicalization is taking place, or threatens to take place, and empower the moderates and reformers to identify and help those at risk of succumbing to ideological extremism. There are other approaches as well.

We should not be lulled into any sense of complacency about the sort of world in which we live. But neither should we succumb to hysteria and assume that the sky is falling. Neither should we pretend that this is all white hat/black hat drama. In Canada and, especially, in the United States, in recent months, we have seen those in authority – police – shoot several innocent civilians. And we have plenty of examples of overreach by intelligence and security agencies that seem to view their constitutional limitations as mere suggestions. This may be a time to strengthen laws that protect our civilian populations from terrorists, but citizens should likewise ask when we will see legislation that ensures our civil liberties are as secure as our physical well-being.

Underpinning all of this discussion, though, is a problem far more immediate to Canadians: political polarization. Would it be too much to ask that, on an issue the federal government rhetorically insists is as extraordinarily urgent as protecting Canadians from terrorism, it might reach across the aisle and work with opposition members, rising above partisanship to develop responses to genuine national security threats?

Imagine if, instead of a government-initiated security bill pushed through by a majority government, we engaged opposition parties and Canadian citizens to discuss and propose a consensus around these issues that balances the demand between our freedoms and our personal and collective security. That would be an exercise in democracy that would truly define the difference between the enemies who seek to destroy us and the values we cherish.

Perhaps it’s too much to expect in an election year.

Posted on February 6, 2015 (updated February 5, 2015). Author: The Editorial Board. Categories: From the JI. Tags: privacy, terrorism.