Friday, July 14, 2023

Choose A Human Writer, Not AI


AI, or Artificial Intelligence, is being touted by some as the answer to all our problems. 

But like past fads, such as expensive NFT cartoons and graphics, AI has problems of its own, and rushing to ditch humans in favor of machines in every field makes no sense.

A New York lawyer was duped by ChatGPT last week and is now facing legal sanctions after he used the AI model for research, insisting he didn’t realize it could lie.

"New York aviation lawyer Steven Schwartz may face professional sanctions after a legal brief he submitted was discovered to be full of “bogus judicial decisions” and fake quotes authored by AI language model ChatGPT, according to court records published last week." Reports the BBC.

"Schwartz told the court in an affidavit on Thursday that he was using ChatGPT for legal research for the first time when he put it to work drafting the ten-page brief he hoped would convince Manhattan Federal Judge P. Kevin Castel not to dismiss a case he was advocating. He explained that he 'therefore was unaware of the possibility that its content could be false.'

It turned out that every case cited in the AI-generated brief was fake, entirely made up by the program.

AI-written pieces have come under scrutiny lately, but most articles about the technology have been glowing and optimistic about how AI and various chatbots can help humanity.

There are a few obvious ways in which AI could be very useful. In space exploration, over 5,500 new exoplanets have been discovered in recent decades by large telescopes put into orbit around the Earth. Sifting through all the data they have collected could take decades, but using AI has greatly sped up the process.

When it's time to visit another solar system, AI-run rockets making journeys of thousands of years may make more sense than sending humans.

When such superhuman tasks aren't required, however, we should keep humans employed. While more training may be required in some fields, like law, this attorney's experience is a lesson in the dangers of overreliance on AI.

Putting millions of future attorneys and paralegals out of work is not good for society or for the profession. One can already see an early backlash against relying on machines in the growing unease with self-checkouts at Walmart and other retailers. One can envision doctors who rely on AI rather than human experience making horrific mistakes in the future.

Doing away with article writers seems like a logical step to some, but a recent Wall Street Journal article noted that while some AI articles submitted to the publisher of a retirement magazine contained no grammatical mistakes, most devolved into gibberish and "nonsense" and were sifted out by the publisher.

One suspects that news editors are doing the same thing. Professors also warn of this threat, but they have been sifting out plagiarism for decades.

By Stephen Abbott of Abbott Media Group

[AI was NOT used to write this article.]

Tuesday, July 4, 2023

When Parties Screw Up... the Latest Example

Political parties, just like candidates, always need to take care to use public events to build their images and avoid negative stereotypes that could be used by opponents. For Democrats, the GOP attack that they, deep down, really HATE America is one they need to be fighting, not reinforcing.

But in Arizona, the Pima County Dems this week advertised a July 4 rally in a city park to protest the US Supreme Court's overturning of Unlimited Baby Abortions and Legalized Race Discrimination in higher education. The wording of the flyer and the name of the event, "F**k the Fourth," said too much about their real feelings about America.

The AZ Republican Party and even the AZ Democrats condemned the event poster, though the Dems said the event organizers' "anger" about the decisions was justified.

The unaltered tweet, shown below, was deleted after it hit the media.

Similarly, last year, Democrats trashed America in an Orlando, Florida, city newsletter meant to discuss Fourth of July events.