Meta is opening up the possibility for anyone in the US to create AI versions of themselves on Instagram or the web, with a new tool called AI Studio.
The idea is that creators and business owners will use these AI profiles to speak to their followers on their behalf. The AIs will be able to talk directly to people in chat conversations and respond to comments on the creator’s account. Meta says US Instagram users can start using AI Studio through its website or by starting a new “AI chat” directly on Instagram.
In a blog post on Monday, the company writes that “creators can customize their AI based on things like their Instagram content, topics to avoid, and links they want to share.” The post goes on to say that creators will be able to toggle settings like automatic responses from their AI and dictate which specific accounts their AI is allowed to interact with.
AI Studio also allows you to create entirely new AI characters that can be deployed across Meta’s apps. Here, Meta is going after startups like Character.AI and Replika, where people are already talking to — and even falling in love with — themed chatbots. Much like OpenAI’s GPT Store, Meta will also feature AI characters that people create for others to try out.
Meta’s first attempt at this concept was to have a handful of celebrities create AI versions of themselves with the same likeness but different names and personas. At the time, Meta said it took this approach because it was concerned about AI versions of celebrities saying problematic things on behalf of their human counterparts. (Even with AI Studio’s built-in controls, this is still bound to happen. We’re dealing with generative AI, after all.)
It seems Meta is at least aware that this is risky territory. The company says that AI profiles are clearly labeled as such everywhere they appear. A company guide for creators goes into more detail about the AI creation process, and it appears it’s up to the creator to list the topics their AI won’t engage with. One of Meta’s examples of a question an AI can be instructed not to answer: “Should I invest in crypto?”