Breaking News in Technology & Business – Tech Geekwire

    AI’s Price Tag: Tech Giants Shift Costs to Consumers

By techgeekwire · March 3, 2025

    After a year of aggressively integrating generative AI into their core products, tech giants like Microsoft and Google are now grappling with a harsh reality: making their AI investments pay off. The initial enthusiasm is giving way to a more pragmatic approach, characterized by price hikes, the introduction of advertising, and other strategies that effectively shift the financial burden onto consumers.

    Is this a sign that the AI boom is losing steam? The situation is more complex than a simple reversal. While the commitment to AI remains unwavering, companies are struggling to monetize the technology. Their response? Finding subtle, less obvious ways to make consumers foot the bill.

    Shifting the Costs

    Last week, Microsoft quietly scaled back its planned data center expansions. Concurrently, the company increased subscription prices for its flagship 365 software by up to 45% and introduced an ad-supported version of some products. Additionally, Microsoft CEO Satya Nadella has recently downplayed the tangible value generated by AI so far.

    These actions might seem counterintuitive in the face of the current AI frenzy, especially when contrasted with the splashy announcements from companies like OpenAI regarding their $500 billion Stargate data center project. However, a closer look reveals that these moves do not indicate a retreat from AI. Instead, Microsoft is adapting its strategy to make AI profitable by subtly shifting costs onto consumers.

    The High Cost of Generative AI

    Generative AI is, undeniably, expensive. OpenAI, a market leader with approximately 400 million active monthly users, is incurring massive losses. Last year, the company generated US$3.7 billion in revenue but spent nearly US$9 billion, leading to a net loss of about US$5 billion.

    OpenAI CEO Sam Altman has stated that the company is losing money on its US$200 per month ChatGPT Pro subscriptions. Microsoft, OpenAI’s largest investor and cloud computing provider, also indirectly bears the brunt of these expenditures.

    What factors contribute to the high cost? Beyond human labor, there are two critical expenses associated with AI models: training (building the model) and inference (using the model). While training is a substantial upfront investment often involving significant resources, inference costs escalate with the user base. Furthermore, the larger and more sophisticated the AI model, the more expensive it becomes to operate.
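The split between the two cost buckets can be sketched with a toy model (all figures below are hypothetical, purely for illustration, not OpenAI's or Microsoft's actual economics): training is a fixed, one-off expense, while inference spend grows linearly with the user base and query volume.

```python
def total_cost(training_cost, cost_per_query, users, queries_per_user):
    """Toy model: a fixed training cost plus inference cost that scales with usage."""
    return training_cost + cost_per_query * users * queries_per_user

# Hypothetical figures: a $100M training run and $0.01 of compute per query.
small_base = total_cost(100_000_000, 0.01, 1_000_000, 100)    # 1M users
large_base = total_cost(100_000_000, 0.01, 400_000_000, 100)  # 400M users

print(f"1M users:   ${small_base:,.0f}")   # training dominates the bill
print(f"400M users: ${large_base:,.0f}")   # inference dominates the bill
```

At small scale the training run dominates; at ChatGPT-like scale, inference swamps it, which is why per-user costs become the pressing problem.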

    Searching for Cheaper Workarounds

    A single query on OpenAI’s most advanced models can cost up to US$1,000 in compute power alone. Figures like these point to a significant financial shortfall, not just on the free tier but on the premium subscription services as well.

    Both training and inference rely heavily on data centers, which are a major cost factor. The specialized chips required are exceedingly expensive, and electricity, cooling, and hardware depreciation add substantial ongoing costs. Tech companies face a growing problem recouping the cost of running the data centers that power their generative AI products.

    To date, innovation in AI has largely been a matter of scale. OpenAI describes its newest model as “a giant, expensive model”. There are signals that this relentless pursuit of scale may not be necessary. DeepSeek, a Chinese company, created comparable models for just a fraction of the traditional training expense. Researchers at the Allen Institute for AI (Ai2) and Stanford University claim to have trained a model for as little as US$50.

    Essentially, big tech AI might not be profitable due to the expense of building and maintaining the necessary data centers.

    Microsoft’s Strategy

    Having invested billions into generative AI, Microsoft is focusing on a viable business model to turn the technology profitable. Over the past year, the tech giant has integrated its Copilot generative AI chatbot into its products. Today, it is impossible to purchase a Microsoft 365 subscription without Copilot. Consequently, subscribers are experiencing significant price increases.

    The cost of running generative AI models in data centers is substantial. Microsoft is therefore attempting to move more of the processing burden onto users’ devices, where the user bears the device-specific hardware and operating costs.

    As evidence of this strategy, Microsoft has added a dedicated Copilot key to new devices, which it frames as enabling people to participate in the AI transformation. Apple is taking a similar approach, focusing on on-device AI processing rather than cloud-based services. Apple’s new devices offer AI capabilities along with the promise of consumer data privacy benefits.

    Pushing Costs to the Edge: Opportunities and Tradeoffs

    There are key advantages to doing generative AI inference on personal devices, including phones, laptops, and smartwatches – a concept known as “edge computing.” It can mitigate the costs of data centers, diminishing the environmental impact related to resources, waste, heat, and water use, which can lower AI’s carbon footprint. It may decrease bandwidth requirements and improve user privacy.

    Edge computing can reduce the energy, resources and waste of data centres
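The provider-side incentive can be seen in a similar back-of-the-envelope sketch (again with made-up numbers): every query pushed onto the user's device is a query the company no longer pays data-center compute for.

```python
def provider_inference_bill(users, queries_per_user, cloud_cost_per_query, edge_fraction):
    """Provider's inference bill when edge_fraction of queries run on-device;
    the user's own hardware absorbs that share of the cost."""
    cloud_queries = users * queries_per_user * (1 - edge_fraction)
    return cloud_queries * cloud_cost_per_query

all_cloud = provider_inference_bill(1_000_000, 100, 0.01, 0.0)  # everything in the data center
half_edge = provider_inference_bill(1_000_000, 100, 0.01, 0.5)  # half the queries on-device

print(f"All cloud: ${all_cloud:,.0f}")
print(f"Half edge: ${half_edge:,.0f}")
```

Halving the cloud share halves the provider's bill; the saved compute doesn't disappear, it moves onto hardware the consumer has already paid for.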

    However, edge computing introduces its own set of challenges. It shifts computation costs to consumers, which can drive demand for new devices despite environmental and economic concerns. That pressure could intensify as newer, more demanding generative AI models arrive, potentially creating more electronic waste. If access to a given AI capability depends on owning a capable device, it also risks creating a digital divide, especially in educational settings. And despite the appearance of a more “decentralized” approach, a handful of companies could still control the transition, undercutting the value decentralization promises. The financial and environmental impacts of these evolving approaches are worth watching.

    As the expenses surrounding AI infrastructure rise and model development advances, shifting costs to consumers has become an increasingly attractive strategy for AI companies. While bigger organizations such as universities or governments may manage these costs, most consumers and small businesses may find this increasingly challenging.

    © 2025 TechGeekWire.