In this episode of the Self-Publishing News Podcast, Dan Holloway discusses a Society of Authors survey revealing significant concerns among creatives about AI's impact on their work and income. He also covers the UK government's latest updates on AI regulation, noting a lack of protections for content creators. Additionally, Dan highlights Substack's new partnership with Spotify, which enhances podcast visibility and subscriber integration.
Find more author advice, tips, and tools at our Self-publishing Author Advice Center, with a huge archive of nearly 2,000 blog posts and a handy search box to find key info on the topic you need.
And, if you haven’t already, we invite you to join our organization and become a self-publishing ally.
Listen to Self-Publishing News: AI's Impact on Creatives
Don't Miss an #AskALLi Broadcast
Subscribe to our Ask ALLi podcast on iTunes, Stitcher, Player.FM, Overcast, Pocket Casts, or Spotify.
About the Host
Dan Holloway is a novelist, poet, and spoken word artist. He is the MC of the performance arts show The New Libertines, and he competed in the National Poetry Slam final at the Royal Albert Hall. His latest collection, The Transparency of Sutures, is available on Kindle.
Read the Transcripts to Self-Publishing News: AI's Impact on Creatives
Dan Holloway: Hello, and welcome to another self-publishing news from here in a very rainy and windy Oxford.
It's been a week of several very big AI stories, so we have that to look forward to, including some fascinating insights from a survey by the Society of Authors, and the much-anticipated update from the UK government on potential regulation of AI.
Substack Partners with Spotify
That is to come, but we will start with the news item that's not AI related, which is something that nonetheless will be of great interest to many of you, particularly those of you who use newsletters, and that is that Substack has partnered with Spotify.
So Substack, used by many of you for your newsletters, also used by many people for podcasts.
This integration, as will not be a surprise, will allow Substack podcasts to be broadcast through Spotify. They will be open to Spotify subscribers. They will be discoverable on Spotify.
Those podcasts that are not available to anyone other than paying subscribers through Substack will show as requiring subscription. So, you will still have to be a Substack subscriber to listen to them, but they will nonetheless still be discoverable through Spotify.
So, for authors looking for a bigger audience, you will still be able to use Spotify's algorithms to help listeners find their way to your podcast, and then you will be able to entice them to subscribe through your Substack and listen through Spotify. More interesting developments there from Substack and Spotify.
Society of Authors Survey Highlights Creators' Concerns Around AI
The main news this week is that the Society of Authors has been undertaking a survey of everyone in the creative industries about their thoughts on AI: how they feel they will be affected in the future, whether they are using AI, and whether they feel they've actually been affected already.
It's not going to come as any surprise that a lot of people do feel they have been affected already. Interestingly, illustrators and translators feel particularly strongly that their work has already been devalued by AI. They're already being undercut, and they're already getting fewer opportunities. That's around a third of respondents in those categories already feeling that.
But when it comes to worries about who is going to suffer in the future, that's over half of everyone. This is really interesting: 57 percent of non-fiction writers feel their income is going to be affected in the future, compared with around two thirds, 65 percent, of fiction writers. That feels really strange given what we have seen already around journalism and copy, which is clearly affecting those who write non-fiction in a way that it doesn't necessarily seem to be affecting fiction writers yet.
So, really interesting to see that fiction writers are more worried about the future than non-fiction writers. It may be that non-fiction writers have just given up already, which would be a really sad state, or it may reflect the fact that they feel they're starting from a lower base. Who knows?
It would be interesting to know. It will also be interesting, obviously, to follow through.
When it comes to translators and illustrators, those figures are even higher, they're over three quarters for both, who think they will be affected in the future.
So, that's quite a gloomy picture of people's attitudes, people's feelings, and their general sense of optimism or otherwise, and they're mainly in the "otherwise" category.
Otherwise, the figures on the use of AI are really interesting. It seems that lots of people in the creative industries are actually using AI already. I think 30 percent of people are already using AI to brainstorm ideas, whereas a fifth of creators have said they're using generative AI in their work itself already.
It would be really interesting to see the intersection between those groups and the optimists and pessimists. Is it that people are pessimistic and feel they have to use the tools that are available because otherwise they will sink completely? Or is the world divided between optimists who are embracing AI on the one hand and pessimists who are rejecting it on the other, or between people who reject it as a matter of principle and people who sense that they need to be, or want to be, more pragmatic? It would be fascinating to find out what those intersections are.
UK Government AI Update Favours Regulators Over Creators
That's a gloomy picture from creatives, but then we also had the UK government's update from the Competition and Markets Authority. An absolutely thrilling-sounding government organization.
It's their update on potential antitrust issues around AI, and it's actually only 24 pages, which for a government body update is quite short. So, I did read it all, and I have to say, creators are probably quite right to be quite gloomy, because it's interesting to see what's mentioned. It's also really interesting to see what's not mentioned, and one of the things that's not mentioned is copyright.
Indeed, I think people who write words, who draw pictures, who sing songs, feel like they are a bit absent from this report on the future of what it calls foundation models. These are the big generative AI models.
Everything is about what the government would call data owners. This isn't a GDPR definition of data owners; this is data owners in the AI industry's sense, and those data owners are the people whose data goes into the training models, basically.
It's really quite depressing to see how much the people who actually came up with the words in the first place have been erased from that picture. The paper says, essentially, there is a danger that big tech companies are going to monopolize the AIs of the future, and in order to prevent that, we need to have a freer market.
So, this is part of the UK government's attempt to position itself as the friend of big tech, which has also had the rather obvious side effect of making it come across as not so much the friend of creators. This very much feels as though it is pitching itself in opposition to the European Union, whose AI Act clearly positioned itself on the side of publishers, content creators, rights holders, and copyright holders, and said to the tech industry: no, you can't regulate yourselves, we are going to regulate you, you are going to have to be transparent, and you are going to have to compensate people appropriately for using their work.
For the UK government, everything about regulation is about how the industry needs to align itself to the government's principles, and that of course is shorthand for self-regulation. So, what the European Union rejected seems to be what the UK government is proposing, which rather feels like recent history playing itself out again.
The regulation, or the self-regulation in particular, is focused on ensuring that, as they put it, consumers would get access to as many different AI models as possible. They don't want one AI model to gatekeep data.
Again, the way I am reading that is, if your work is available to one AI model, the government want it to be available to all of them. So, this sort of principle of transparency and consent, the kind of contracts we're signing that might give us the ability to say we want our work to go here, but not there, that kind of control for creators is going to be, if the government has their way, eroded because it will be seen as anti-competitive, because it will mean that consumers will have fewer AI models to go to who have been trained on data that's as broad as possible.
So, it all feels rather up in the air and somewhat worrying. I'm not a pessimist by nature, and I'm certainly not a tech pessimist by nature; I consider myself a tech pragmatist. But this feels as though the government isn't necessarily on the side of creators, and something might need to be done about that. It would be good to see a concerted effort to point out some of the holes in this plan; that's probably a nice way to sum it up.
So, I will leave that there. There's a link in this week's news to this fascinating 24-page report. Do go and read that. Absolutely go and read the Society of Authors survey, because that really is interesting.
I will, of course, update you on everything that happens in this very fast-moving situation, and hope to bring you some more positive and less AI-related news next week. Thank you.