Microsoft announced that the new Bing with ChatGPT uses a customized version of GPT-4 optimized for search. The technology company took advantage of OpenAI's announcement to confirm what many had suspected for weeks. The new AI model can solve difficult problems more accurately, thanks to broader general knowledge and improved skills, its creators said.
“If you’ve used the new Bing preview at any time in the last five weeks, you’ve already experienced a preview of this powerful model,” said Yusuf Mehdi, director of Search and Devices at Microsoft. “As OpenAI makes updates to GPT-4 and beyond, Bing benefits from those improvements,” he stated. The executive added that updates made by Microsoft engineers are layered on top of the model to ensure that users have the most complete version.
GPT-4 can handle more than 25,000 words of text, which translates into longer generated content and longer conversations. According to OpenAI, the new model is 82% less likely to respond to requests for disallowed content and 40% more likely to produce factual responses than GPT-3.5.
The integration of GPT-4 into Microsoft's browser would make it a powerful assistant, or “copilot,” as the technology company calls it. Bing could turn web searches into a conversation with a more “human” AI that makes fewer mistakes. Once Microsoft has analyzed its behavior and polished the rules, users should be able to chat at length without the ChatGPT AI going off the rails.
Bing’s secret weapon will be image recognition
Among the improved features we could see in future versions of Bing with ChatGPT is document analysis. One of GPT-4's capabilities is that it can translate and improve writing by spotting misspellings and grammatical errors. The AI is also more creative than the previous version, so it has a more refined sense of humor and can compose better songs or even write movie scripts.
One of the most important capabilities of GPT-4 is image recognition. The new version of the model accepts images as input and can generate captions, classifications, and analyses. In practical terms, users could submit a picture of vegetables or meat and ask Bing what dishes they can cook with them.
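For illustration only, here is a minimal sketch of how an image-plus-text prompt of that kind can be sent to a vision-capable GPT-4 model through OpenAI's public API. The model name and image URL below are placeholders, and the details of Bing's own integration have not been published.

# Minimal sketch: asking a vision-capable GPT-4 model what to cook
# from a photo of ingredients. Requires the `openai` Python package
# and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder: any vision-capable GPT-4 model
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "What dishes could I cook with these ingredients?"},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/vegetables.jpg"}},  # placeholder image
            ],
        }
    ],
)

print(response.choices[0].message.content)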
A few weeks ago, Microsoft announced that it was working on a model capable of extracting text from images and solving puzzles. Kosmos-1 can answer questions about images, perform simple math operations, and recognize text and numbers. The most interesting thing about this AI, however, is its performance on a test that measures human intelligence and abstract reasoning.
Although Kosmos-1 has no relation to OpenAI or GPT-4, it is clear that Microsoft is interested in developing and eventually integrating a multimodal model. The technology company confirmed that it will hold an event on artificial intelligence on March 16, where it will share details about how AI will benefit its upcoming products and services.
How to use GPT-4 in Bing with ChatGPT
Until that day arrives, those interested in using GPT-4 in Bing with ChatGPT can sign up for the preview phase. Setting Edge as the default browser and Bing as the default search engine will help you skip a few places in line. For now, the availability of the final version is unknown, although if you get access to the beta, you can also use it on your mobile device.