The recent rise of AI has led to many controversies surrounding its use. Those conflicts have made their way into school environments, raising questions about the technology's necessity and benefits.
Schools across the country have adopted a new form of surveillance: Social Sentinel. Its intended use is admirable: proactively addressing possible security concerns in the digital town square of social media.
“I think it’s almost appropriate to use for security concerns, assuming that the system works well,” sophomore Ethan Elliott said.
In reality, Social Sentinel has been used in many schools, especially universities, as a means to monitor protests on campus. According to the Dallas Morning News, 37 colleges in America have admitted to using the software, but the owner has said hundreds of colleges in 36 states pay for the application.
“On a home you advertise your security system,” Elliott said.
Elliott points out that other forms of surveillance are usually advertised, offering some transparency, but many schools use Social Sentinel without students' knowledge.
“In school computers there are some tools that react if there is something that is a safety concern,” principal Abby Hunt said.
This form of surveillance is common knowledge among students and staff, and consent forms are signed at the beginning of the school year.
“There is transparency behind it,” Hunt said.
Ballard's transparency with this use of AI does not raise concerns within the affected community. The school computers' reaction to security concerns, rather than Social Sentinel's proactive surveillance, also avoids questions of necessity.
While Hunt provides her insights, she also wants the community to know that BHS is not affiliated with any use of AI for social media surveillance.
Not only are there debates around the lack of transparency, but there have also been reports of the software's outright misuse. Social Sentinel has in fact encouraged schools to use the software to “forestall” and “mitigate” protests, as the Dallas Morning News found.
A student from North Carolina A&T recounted school administrators sifting through her social media after she posted about the alleged mishandling of her rape allegations, according to the Dallas Morning News and the Harvard Crimson.
“If we are talking about active surveillance, that sounds concerning,” Hunt said.
The founder of Social Sentinel, Gary J. Margolis, created the program after working for the largest professional services firm serving universities, colleges and K-12 schools. After working across these levels of education, the impact social media had on students became evident to him.
In 2020, Social Sentinel joined Navigate360, a company whose services are used by governmental bodies, schools and small businesses alike to promote social welfare and identify security concerns.
Schools’ surveillance of public accounts does not actively infringe on a student’s right to free expression under the First Amendment.
“You’re still able to post whatever you want,” Elliott said.
But does the fact that Social Sentinel only has access to public accounts diminish the validity of arguments against its use?
“If you post something publicly you should expect everyone to be able to look at it,” Elliott said.
Without the added barrier that protects private accounts, public accounts are denied the same level of privacy.
“The AI is automating it so that every time you post something dumb you get caught,” Elliott said.
While considering its overall effectiveness, Elliott noted that unnecessary flagging reflecting certain biases has also been reported with the use of AI in schools. An associate professor at Stanford explains that AI perpetuates biases handed down by its makers, so much so that schools must take this into account before blindly believing AI reports.
“As long as it isn’t falsely flagging people, or flagging people off of bias or certain racial groups more, then I think it is a good thing to have,” Elliott said.
This, however, is exactly the reality of AI. As explained in a Word In Black article, the lack of diversity in the tech industry partly explains the social biases shown within AI software. AI designers don’t usually take into account the culture and experiences of minorities in school communities.
When asked about any benefits of adopting Social Sentinel at BHS, Hunt responded, “I cannot speak to that; that would be a school board policy question. This is new information and I would like to better familiarize myself with it.”
Elliott, however, offers his own take.
“Once I can know that it is going to be a high net positive for the school to use it, great,” Elliott said.
Debates surrounding the use of AI have been widespread, bringing many perspectives to the growing argument. Whether it should be used in a school setting is ultimately up to administrators and board members.