Making ChatGPT Cheaper: Microsoft Aiming to Cut Costs by Over $700,000 Daily


Running an advanced AI service like ChatGPT is an extremely costly venture for OpenAI. According to Dylan Patel, an analyst at SemiAnalysis, ChatGPT could cost at least $700,000 per day to operate, because it requires immense computing power on expensive servers to answer queries. That estimate is based on OpenAI’s GPT-3 model; if GPT-4 were used, the costs would be even higher.
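Estimates like this are driven by two numbers: the inference cost per query and the daily query volume. A minimal sketch of that back-of-envelope arithmetic is below; the per-query cost and query count used here are illustrative assumptions, not figures reported in the article.

```python
# Back-of-envelope daily serving cost: cost per query * queries per day.
# Both inputs below are hypothetical placeholders, not reported figures.

def daily_inference_cost(cost_per_query_usd: float, queries_per_day: int) -> float:
    """Estimate total daily inference cost in USD."""
    return cost_per_query_usd * queries_per_day

# Example: at an assumed $0.0036 per query and ~200 million queries per day,
# the daily bill lands in the ~$700,000 range cited by analysts.
estimate = daily_inference_cost(0.0036, 200_000_000)
print(f"${estimate:,.0f} per day")
```

Small shifts in either assumption move the total by hundreds of thousands of dollars a day, which is why per-query hardware cost is the lever Microsoft is reportedly targeting.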

In an effort to reduce the cost of running GPT technology, Microsoft is reportedly developing a secret AI chip codenamed Athena. Athena is intended for internal use by Microsoft and OpenAI and could be released as early as next year. Reports suggest that around 300 people are currently working on the project, which began in 2019. The goal is to give Microsoft and OpenAI their own AI chips at a lower cost than off-the-shelf GPUs.

AI is still a relatively young field, and the surge in demand for AI solutions has led many companies to invest heavily in the underlying infrastructure. For example, the startup Latitude was reportedly spending $200,000 a month on AI models and Amazon Web Services servers. To cut that bill, Latitude switched to language software from AI21 Labs, which reduced the cost to $100,000 a month.

The enterprise mentioned in this article is OpenAI, a San Francisco-based research laboratory focused on developing human-level artificial intelligence. It was founded in 2015 by Elon Musk, Sam Altman, and a group of other entrepreneurs and researchers. Its stated goal is to develop artificial general intelligence (AGI) that benefits humanity. OpenAI has received investments from a range of venture capital firms and industry leaders, most notably Microsoft.


The person mentioned in this article is Dylan Patel, chief analyst at the semiconductor research firm SemiAnalysis. He has provided insight into the costs of running OpenAI’s GPT-3 and GPT-4 models. His initial estimates are based on GPT-3, and he notes that the costs of GPT-4 could be even higher. Patel and his SemiAnalysis colleague Afzal Ahmad have pointed out that the operational costs, or inference costs, of running ChatGPT far exceed the training costs.

