NEW YORK, March 16 — One month after unveiling the technology, which is still in the testing phase, OpenAI has indicated that Sora will be available to the public later this year. After ChatGPT for generating text and Dall-E for generating images, OpenAI is now looking to fine-tune its new artificial intelligence tool capable of generating videos.

The American start-up caused yet another sensation last month when it presented a new form of video-generating artificial intelligence called Sora. From a simple prompt of just a few words, the tool can produce a video sequence lasting several seconds, generated entirely automatically. Examples shown featured landscapes as well as animals and even human beings. While the results are impressive to watch, none of the images are “real.”

In an interview with the Wall Street Journal, OpenAI's chief technology officer, Mira Murati, announced that this easy-to-use technology will be made available to the public before the end of 2024, although precise details about subscription formats or an exact launch date were not given. There are still many areas for improvement before the tool can be deployed on a large scale: it sometimes lacks fluidity and may have difficulty understanding certain cause-and-effect relationships. For example, a person may bite into a cookie, yet the cookie shows no bite marks afterwards. The tool also needs to become less resource-intensive, Murati revealed. It won't go live until it is properly optimized to require minimal computing power.

The company is also planning to build in certain limitations and restrictions. Sora will reject requests that are too violent and will not generate hateful images or sexual content. OpenAI also promises not to use images of celebrities. This is essential, as the availability of this type of tool raises ethical issues at a time when fake news is on the rise. Indeed, with Sora potentially launching in the midst of the American presidential election campaign, OpenAI needs to integrate safeguards to prevent its tool from being used for disinformation. — ETX Studio