How an AI-written book shows why the tech 'horrifies' creatives

For Christmas I received a fascinating present from a friend - my very own "best-selling" book.
"Tech-Splaining for Dummies" (terrific title) bears my name and my image on its cover, and it has glowing evaluations.
Yet it was entirely written by AI, with a few simple prompts about me provided by my friend Janet.
It's a fascinating read, and really amusing in parts. But it also meanders rather a lot, and is somewhere between a self-help book and a stream of anecdotes.
It mimics my chatty style of writing, but it's also a bit repetitive, and very verbose. It may have gone beyond Janet's prompts in gathering data about me.
Several sentences start "as a leading technology journalist ..." - cringe - which might have been scraped from an online bio.
There's also a mysterious, recurring hallucination in the form of my cat (I have no pets). And there's a metaphor on almost every page - some more random than others.
There are dozens of companies online offering AI book-writing services. My book was from BookByAnyone.
When I contacted the company's boss, Adir Mashiach, based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.
A paperback copy of your own 240-page best-seller costs £26. The firm uses its own AI tools to generate them, based on an open source large language model.
I'm not asking you to purchase my book. Actually you can't - only Janet, who produced it, can order any further copies.
There is currently no barrier to anyone creating one in anybody's name, including celebrities - although Mr Mashiach says there are guardrails around abusive content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "entirely to bring humour and delight".
Legally, the copyright belongs to the firm, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books do not get sold further.
He plans to expand his range, generating different genres such as sci-fi, and perhaps offering an autobiography service. It's intended to be a light-hearted form of consumer AI - selling AI-generated goods to human customers.
It's also a bit scary if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in parts, sound like me.
Musicians, authors, artists and stars worldwide have expressed alarm about their work being used to train generative AI tools that then produce similar content based upon it.
"We ought to be clear, when we are discussing information here, we really mean human creators' life works," states Ed Newton Rex, creator of Fairly Trained, which projects for AI companies to respect creators' rights.
"This is books, this is articles, this is images. It's masterpieces. It's records ... The entire point of AI training is to learn how to do something and after that do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. It didn't stop the track's creator trying to nominate it for a Grammy award. And even though the artists were fake, it was still hugely popular.
"I do not think the usage of generative AI for creative purposes ought to be banned, however I do think that generative AI for these purposes that is trained on people's work without permission should be prohibited," Mr Newton Rex adds. "AI can be very effective however let's develop it morally and relatively."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT creator OpenAI for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the internet to help develop their models, unless the rights holders opt out.
Ed Newton Rex describes this as "madness".
He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and altering copyright law and destroying the livelihoods of the nation's creatives," he argues.
Baroness Kidron, a crossbench peer in the House of Lords, is also strongly against removing copyright law for AI.
"Creative industries are wealth creators, 2.4 million tasks and a lot of pleasure," says the Baroness, who is also an advisor to the Institute for Ethics in AI at Oxford University.
"The government is weakening among its best carrying out industries on the unclear promise of development."
A government spokesperson said: "No move will be made until we are absolutely confident we have a practical plan that delivers each of our objectives: increased control for right holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for right holders from AI developers."
Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.
In the US the future of federal rules to manage AI is now up in the air following President Trump's return to the presidency.
In 2023 Biden signed an executive order that aimed to boost the safety of AI with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they are released.
But this has now been rescinded by Trump. It remains to be seen what Trump will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and in particular against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.
They claim that the AI firms broke the law when they took their content from the internet without their consent, and used it to train their systems.
The AI firms argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If this wasn't all enough to think about, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the price of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's existing dominance of the sector.
As for me and a career as an author, I think that at the moment, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for larger projects. It is full of errors and hallucinations, and it can be quite hard to read in parts because it's so long-winded.
But given how quickly the tech is evolving, I'm not sure how long I can remain confident that my considerably slower human writing and editing skills are better.