With AI, leaders weigh public safety and rights to privacy

Mass shootings. Bullying that won’t quit, whether online or in real life. As the nation’s societal ills continue to infiltrate our schools, the tech sector is offering proactive solutions that aim to stave off self-harm or violence to others. Now, with artificial intelligence, or AI, a team of school district leaders can be alerted to threats and respond in real time.

But at what cost?

In this moment of ever-advancing technology, where AI can comb through public social media in an instant and where security cameras are everywhere, the human component is more essential than ever, experts say. Front and center are public safety and individual rights.

“In the security world, you’re always balancing keeping people safe and also keeping things private,” said Derek Peterson, CEO of Ronkonkoma-based Soter Technologies. “You’re always walking that tightrope.”

“It’s controversial on some ends,” said Robert Mesas, a vice president at Central Business Systems, a Melville-based full-service IT consulting company.

Yet “the goal” of most monitoring programs is to “raise awareness of self-harm and violence,” said Mesas, whose client base includes school districts. When it comes to public safety, these clients are seeking options.

They want solutions at a time when, in this year alone, 74 people have already been killed or injured in schools in the United States, including the fatal shooting of three children and three adults last week in Nashville. And according to the Washington Post’s tracker, 2022 saw 46 school shootings, the highest number since 1999.

“School officials are under pressure from everybody, as is the government, to do something, and I don’t think anybody really knows what to do,” Michael Nizich told LIBN. Nizich is the director of the Entrepreneurship and Technology Innovation Center and an adjunct associate professor of computer science at New York Institute of Technology. On this subject, he said, he was speaking as a father and a citizen who, like every person interviewed for this article, wants kids to be safe.

When it comes to mass shootings, the Violence Project, a nonprofit research center, states that nearly half of all perpetrators have leaked their plans in advance. And social media can provide that platform, especially as its use grows increasingly prevalent.

Now, Mesas said, Central Business Systems is piloting AI products that monitor public online activity and social media 24 hours a day, 365 days a year, on platforms such as Instagram and Facebook, to identify potential threats. The company is offering two products: one in which the alert is sent directly to the school district, which then applies its own resources to assess it; and one in which a Central Business Systems partner assesses the flagged content and, if appropriate, notifies the school, whose own team of experts applies its predetermined protocols.

“We call it web-scraping … we do it all the time, and it’s legal,” Nizich said about combing through online public posts. “Somebody’s data is out there because that person checked off a user agreement, and they made that public. They’re basically making it public for the entire world to access it.”
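
For readers curious what this kind of web-scraping looks like in practice, here is a minimal Python sketch that pulls a batch of public posts and scans them for watch-list terms. The endpoint URL, JSON field names and watch terms are all hypothetical stand-ins, not any vendor’s actual system; real platforms require their official APIs and terms-of-service review.

```python
# Illustrative only: fetch public posts and scan them for watch-list terms.
# The endpoint and field names below are hypothetical stand-ins.
import requests

WATCH_TERMS = {"threat", "hurt myself", "bring a gun"}  # illustrative list

def fetch_public_posts(endpoint: str) -> list[dict]:
    """Download a page of public posts from a hypothetical JSON endpoint."""
    resp = requests.get(endpoint, timeout=10)
    resp.raise_for_status()
    return resp.json().get("posts", [])

def flag_posts(posts: list[dict]) -> list[dict]:
    """Return the posts whose text contains any watch-list term."""
    flagged = []
    for post in posts:
        text = post.get("text", "").lower()
        if any(term in text for term in WATCH_TERMS):
            flagged.append(post)
    return flagged

if __name__ == "__main__":
    posts = fetch_public_posts("https://example.com/api/public-posts")
    for hit in flag_posts(posts):
        print("Needs human review:", hit.get("id"))
```

Note that every hit ends in human review, a point the experts quoted here return to repeatedly.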

Data, Nizich said, “is the new oil.” In the 1800s when oil was discovered, “we had to create uses for the oil so we could sell the oil.” Now, he said, with AI algorithms and high-speed, high-efficiency processing capabilities, large amounts of data are used to make decisions in a wealth of industries, from finance to insurance and beyond, with endless possibilities for the future.

Yet the practice of combing through social media could raise “potential concern,” said Paul Trapani, president of Plainview-based Long Island Software & Technology Network, which fosters the region’s tech sector. Issues can include defining “the boundaries of the schools” and “the individual rights of the students,” Trapani said.

There might also be unexpected consequences of web-scraping, Trapani said. Take hiring. An employer might deploy AI to search social media to find everything a person ever tweeted – and that person, while young and foolish, might have tweeted something dumb. “AI can find it in a second,” and perhaps even store it in a database, he said. “It will sit there forever.”

Or suppose, Nizich said, “you have a kid who is a ‘good kid’ who makes a bad joke, and now the system is alerting the school that this might be a target.” In that same school, you might “have a really ‘bad kid’ that is smart enough to not make bad jokes” and therefore goes unnoticed.

In addition, there is the possibility that a company under contract would look to deliver results to its client, and therefore may not be an “impartial contributor,” Nizich said.

All told, there may be “too many negatives for such a program to be a positive,” he said.

Central Business Systems, Mesas said, does not want to issue false alerts. The products are customized with keywords to flag any concerning posts. And yes, he said, there might be instances when a student is merely posting the lyrics to a song; districts “don’t want to cry wolf and don’t want constant alerts.”

That’s why it’s important to have a multifaceted team assess any alerts. The team could consist of the superintendent, a counselor, a psychologist – people who know the kids, Mesas said – who would then evaluate the situation.

“It’s very weighty,” Mesas said, adding that it’s important for school districts and their boards to meet to discuss and develop policy.

“Parents have the right to opt their child out of the service so they don’t get monitored,” Mesas said.
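
To make the tradeoffs Mesas describes concrete, here is a hypothetical sketch of how keyword flagging, a “don’t cry wolf” allowlist for benign phrases such as lyrics, and a parental opt-out might fit together. Every term, name and rule in it is illustrative, not Central Business Systems’ actual product.

```python
# Hypothetical sketch only: keyword flagging with a benign-phrase allowlist
# and a parental opt-out list. All terms and names below are illustrative.
WATCH_TERMS = {"shoot", "hurt myself"}      # illustrative watch list
KNOWN_BENIGN = {"shoot for the stars"}      # e.g., common lyrics and idioms
OPTED_OUT = {"student_123"}                 # parents opted out of monitoring

def assess(author: str, text: str) -> str:
    """Decide how a post should be routed."""
    if author in OPTED_OUT:
        return "ignore"            # opted-out students are never scanned
    lowered = text.lower()
    if not any(term in lowered for term in WATCH_TERMS):
        return "ignore"
    if any(phrase in lowered for phrase in KNOWN_BENIGN):
        return "log-only"          # likely lyrics or idiom: don't cry wolf
    return "review-team"           # superintendent, counselor, psychologist

print(assess("student_456", "shoot for the stars tonight!"))  # log-only
print(assess("student_456", "i am going to hurt myself"))     # review-team
```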

Meanwhile, Soter’s Juno AI product uses cloud-based, AI-driven emotional and behavioral analytics to give school officials insights into the emotional climate of the school. For example, the program can detect if a fight is breaking out, say, on the third floor, and alert school officials who might be on the first floor. This way, they can respond in real time, rather than reviewing the camera recordings after the fact.

Soter can quickly redact faces before publicly releasing videos, aligning with any federal and organizational privacy guidelines.
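
Automated redaction of this kind is commonly built from off-the-shelf computer-vision tools. Below is a generic sketch using OpenCV’s stock face detector and a Gaussian blur; it illustrates the general approach, not Soter’s actual implementation.

```python
# Generic sketch of automated face redaction before a video is released.
# Uses OpenCV's bundled Haar-cascade face detector; not Soter's system.
import cv2

def redact_faces(in_path: str, out_path: str) -> None:
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(in_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # fall back if metadata missing
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(
        out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, fw, fh) in detector.detectMultiScale(gray, 1.1, 5):
            # Blur each detected face region beyond recognition.
            frame[y:y+fh, x:x+fw] = cv2.GaussianBlur(
                frame[y:y+fh, x:x+fw], (51, 51), 0)
        writer.write(frame)
    cap.release()
    writer.release()
```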

The company also has the capability to study facial expressions “to determine the kids who are having a bad day” and alert the school “before they do something harming the school,” Peterson said.

And its FlySense Vaping Solution detects both vaping and potential bullying by monitoring air quality and sound anomalies, including escalating voices, without actually recording conversations, alerting school officials in real time if a fight breaks out in a restroom.
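
One plausible way such monitoring can avoid recording conversations is to reduce each chunk of microphone audio to a single loudness number and discard the samples immediately. The sketch below is a hypothetical illustration of that idea, not FlySense’s actual design; the threshold is invented.

```python
# Hypothetical sketch of "no recording" sound monitoring: each audio chunk
# is reduced to one loudness statistic and the raw samples are discarded,
# so escalating voices can raise an alert without storing any conversation.
import numpy as np

THRESHOLD_DBFS = -10.0  # illustrative: near-full-scale = very loud room

def loudness_dbfs(chunk: np.ndarray) -> float:
    """RMS level of one chunk (samples in -1..1), in dB full scale."""
    rms = np.sqrt(np.mean(np.square(chunk.astype(np.float64))))
    return 20.0 * np.log10(max(rms, 1e-12))

def monitor(chunks) -> None:
    for chunk in chunks:
        level = loudness_dbfs(chunk)
        # Only this scalar survives; the samples themselves are never saved.
        if level > THRESHOLD_DBFS:
            print(f"Loud anomaly ({level:.1f} dBFS): alert staff in real time")
```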

But, Peterson said, technology “is not a silver bullet to keep people safe.” Noting the importance of the human touch, Peterson said the company also works with schools to create anti-vaping awareness campaigns. “The thing we want to do is help kids kick the habit.”

School districts, Mesas said, are already monitoring online activity on school-issued laptops and on devices using the school’s Wi-Fi on its premises. What happens, though, when a student leaves school property and potentially harmful behavior continues?

That’s where, some say, the monitoring can make a difference.

“In 2015, we actually saved a kid from committing suicide,” Peterson said.

Back then, the company monitored social media for schools and picked up on a student who was going to set herself on fire.

“There was a video of herself on fire and our tools picked up on all that,” he said. “So that’s how we got thrust into this marketplace.”

One thing is clear: AI is not going away.

“We are still at the beginning,” Trapani said.

Soter, for example, has products in every state and at thousands of schools in 22 countries, with a big influx in Australia, Peterson said. And the company plans to announce a big deployment in Canada in the coming months.

Meanwhile, the debate over privacy and safety rages on. Mesas poses this question: If it can save one person’s life, is it worth implementing?

It’s why, he said, “school boards and parents need to take a deep dive. It’s a process.”

Adina Genn // LIBN // April 6, 2023 // AGENN@LIBN.COM
