
OpenAI says that ChatGPT’s memory is turned on by default, which means a user has to actively turn it off. The memory can be cleared at any point, either in settings or by simply instructing the bot to wipe it. Once the memory setting is cleared, that information won’t be used to train OpenAI’s AI model. It’s unclear exactly how much of that personal data is used to train the AI while someone is chatting with the chatbot. And toggling off memory does not mean you’ve totally opted out of having your chats train OpenAI’s model; that’s a separate opt-out.

The company also says that it will not store certain sensitive information in memory. If you tell ChatGPT your password (don’t) or Social Security number (or this), the app’s memory is thankfully forgetful. Jang also says that OpenAI is still soliciting feedback on whether other personally identifiable information, such as a user’s race, is too sensitive for the company to capture automatically.

“We think there are a lot of useful cases for that example, but for now we have trained the model to steer away from proactively remembering that information,” Jang says.

It’s easy to see how ChatGPT’s memory function could go awry: instances where a user might have forgotten that they once asked the chatbot about a kink, or an abortion clinic, or a nonviolent way to deal with a mother-in-law, only to be reminded of it, or have others see it, in a future chat. How ChatGPT’s memory handles health data is also something of an open question. “We steer ChatGPT away from remembering certain health details, but this is still a work in progress,” says OpenAI spokesperson Niko Felix. In this way, ChatGPT is the same song about the permanence of the internet, in a whole new age: check out this great new memory feature, until it’s a bug.

OpenAI isn’t even the first to toy with memory in generative AI. Google has emphasized “multi-turn” technology in Gemini 1.0, its own LLM. This means you can interact with Gemini Pro using a single-turn prompt (one back-and-forth between the user and the chatbot) or have a multi-turn, continuous conversation in which the bot “remembers” the context from previous messages.
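For concreteness, the single-turn versus multi-turn distinction looks roughly like this in Google’s google-generativeai Python SDK. This is a sketch based on the SDK’s documented chat interface; model and method names may differ across versions, and the API key is a placeholder.

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder; use your own key
model = genai.GenerativeModel("gemini-pro")

# Single-turn: each call stands alone, with no carried-over context.
print(model.generate_content("Recommend a sci-fi novel.").text)

# Multi-turn: the chat object resends the prior messages with each request,
# which is how the bot "remembers" the context of earlier turns.
chat = model.start_chat(history=[])
chat.send_message("Recommend a sci-fi novel.")
print(chat.send_message("Is it available as an audiobook?").text)
```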

An AI framework company called LangChain has been developing a memory module that helps large language models recall previous interactions between an end user and the model. Giving LLMs a long-term memory “can be very powerful in creating unique LLM experiences. A chatbot can begin to tailor its responses to you as an individual based on what it knows about you,” says Harrison Chase, cofounder and CEO of LangChain. “The lack of long-term memory can also create a grating experience. No one wants to have to tell a restaurant-recommendation chatbot over and over that they’re vegetarian.”
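Chase’s vegetarian example maps directly onto how these memory modules are used in practice. Below is a minimal sketch using LangChain’s classic ConversationBufferMemory, which simply replays the transcript of earlier turns into each new prompt; import paths and class names vary between LangChain releases, and an OpenAI API key is assumed to be set in the environment.

```python
# Sketch of LangChain-style conversational memory (classic 0.1-era API;
# newer releases move these classes to other packages).
from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory

llm = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
memory = ConversationBufferMemory()  # keeps a running transcript of the chat
chain = ConversationChain(llm=llm, memory=memory)

# The first turn establishes a fact about the user...
chain.predict(input="I'm vegetarian. Keep that in mind for restaurant tips.")

# ...and the memory module feeds the earlier turns back into the prompt,
# so the model can tailor this answer without being told again.
print(chain.predict(input="Where should I eat dinner tonight?"))
```

A buffer like this only lasts for one session; LangChain also offers variants that summarize or store memories externally so the context can persist across conversations.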

This technology is sometimes referred to as “context retention” or “persistent context” rather than “memory,” but the end goal is the same: to make the human-computer interaction feel so fluid, so natural, that the user can easily forget what the chatbot might remember. This is also a potential boon for businesses deploying chatbots that may want to maintain an ongoing relationship with the customer on the other end.

“You can think of them as just a bunch of tokens that are getting prepended to your conversations,” says Liam Fedus, a research scientist at OpenAI. “The bot has some intelligence, and behind the scenes it’s looking at the memories and saying, ‘These look like they belong together; let me merge them.’ And then that goes on your token budget.”
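What Fedus describes can be pictured as plain string assembly: remembered facts are spliced into the prompt ahead of the new message, and every remembered token is deducted from the same context budget. The sketch below is purely illustrative, not OpenAI’s actual implementation; the memory entries, the budget figure, and the helper names are all hypothetical.

```python
# Hypothetical illustration of memories being prepended to a conversation.
MEMORIES = [
    "User is vegetarian.",
    "User prefers concise answers.",
]

TOKEN_BUDGET = 4096  # assumed context window for this sketch

def rough_token_count(text: str) -> int:
    # Crude stand-in for a real tokenizer: roughly one token per word.
    return len(text.split())

def build_prompt(user_message: str) -> str:
    # Memories are just more tokens in the prompt, so each remembered fact
    # eats into the same budget as the conversation itself.
    memory_block = "\n".join(f"- {m}" for m in MEMORIES)
    prompt = (
        "Facts remembered about this user:\n"
        f"{memory_block}\n\n"
        f"User: {user_message}\nAssistant:"
    )
    assert rough_token_count(prompt) <= TOKEN_BUDGET, "over the token budget"
    return prompt

print(build_prompt("Recommend somewhere for dinner tonight."))
```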

Fedus and Jang argue that ChatGPT’s memory is nowhere near the capacity of the human brain. And yet, almost in the same breath, Fedus explains that with ChatGPT’s memory, you’re limited to “a few thousand tokens.” If only.

Is this the hypervigilant virtual assistant that tech consumers have been promised for the past decade, or just another data-capture scheme that uses your likes, preferences, and personal data to serve a tech company better than it serves its users? Possibly both, though OpenAI might not put it that way. “I think the assistants of the past just didn’t have the intelligence,” Fedus said, “and now we’re getting there.”

Will Knight contributed to this story.
