News & Analysis

The Two Sides of Generative AI Tools

On the one hand, a global tech giant has banned its use on its premises; on the other, a US app is using it for better neighborhood collaboration

Like it or loathe it, one cannot ignore it. This cliché captures the world's view of generative AI, more specifically ChatGPT, which somehow manages to surprise us each time we set out to find its limitations. Today we look at two incidents, separated by a few thousand miles and involving two entities, that together personify what this next-gen technology has come to mean.

Corporate ban on ChatGPT, albeit temporary

The first of these is tech giant Samsung, which saw sensitive data leak onto ChatGPT a month ago and has reportedly cracked down on the tool's usage on its premises. The ban, reportedly temporary in nature, covers both company-owned hardware and devices not owned by Samsung but operating on its internal networks.

OpenAI need not feel singled out, as the Samsung ban, reported by Bloomberg via an office memo, covers all such technology, including Microsoft's Bing, Google's Bard and everything in between. Of course, the ban applies only to devices used by Samsung staff, not to consumers who own its hardware and connected devices.

The company memo, quoted by the agency, specifically stated that the ban would last only until the company "builds security measures to create a secure environment for safely using generative AI to enhance employees' productivity and efficiency." Reading between those lines, it could only mean Samsung may soon have its own in-house AI tools.

Your friendly neighborhood tension defuser

And while all of this was playing out in faraway Korea, the neighborhood social network Nextdoor decided to launch new features powered by generative AI. The "assistant" feature will help users on the app write better posts with help from ChatGPT, in order to drive positive community engagement.

So, when a user starts writing a post, they will receive suggestions that can be reviewed and edited on the fly. The assistant would also suggest ways to rephrase potentially harmful comments to make them friendlier and more acceptable to the neighborhood. Now, that's a feature plenty of social media users could borrow to reduce the digital vitriol.

A report in TechCrunch quoted Nextdoor CEO Sarah Friar as suggesting that last year's ChatGPT launch made it the right moment for the company to build something that helps users get more traction on their posts. In other words, it suggests ways for users to rephrase or reword a post so that more people respond or react to it.

A future tech that requires present regulation

Having seen two sides of the coin, where does that leave us? To be honest, ChatGPT is surely a technology for the future, with more and more use cases likely to emerge. Does it really help create sticky content with clear differentiators? Well, no! Not at the moment, at least. At best, it provides a first-cut version of the topic one is engaged with.

Of course, there is also the legal side of things. The field of generative AI is already facing some significant challenges, such as proprietary data finding its way into the public domain (ask Samsung!). Many have flagged potential violations of data privacy, copyright issues and inaccuracies in ChatGPT's responses.

More specifically, Samsung's major issue with generative AI is the difficulty of retrieving and deleting data once it has been transmitted to external servers, and the risk that such data could end up being disclosed to other users. In fact, over 65% of the company's employees admitted that the use of such tools carries a security risk.

Faced with such issues, OpenAI has already unveiled a plan to introduce new privacy controls, even as several large banks across the world have restricted employee access to ChatGPT. Meanwhile, in what feels like another universe, social networks are finding uses for generative AI in creating new communities and making them work.
