Microsoft is giving Copilot users access to GPT-4-Turbo for free (tomsguide.com)
75 points by tosh on March 17, 2024 | 44 comments


Get users hooked with free access. Wait a few years. Start having lawyers tweak the terms of service. Charge corporate customers a shit ton of money while free users sell their junk. Yank the “free” tier with little to no notice, as the ToS allows. Cite resource constraints. Stick previously free users with $99/month access.

MSFT stock pumps hard for the next few years.


The fact that MS is doing this could be a sign that most people just don't find much utility in these AI tools.

Like you said, they're trying to get people hooked.

But if AI isn't actually useful to people in the first place, this goes nowhere and they're still stuck with a low or shrinking number of users.


Or perhaps “chat” just isn’t the best use of LLMs. Autocomplete-based integrations, rather than chat, could be better. I know I love and get great utility from GitHub Copilot, but not really any online or offline LLM chatbot.

Also, from a UX perspective, I feel an autocomplete-based UI encourages me to make sure the thing “I’m saying” is actually what I mean to say - ie, to not implicitly trust LLM hallucinations.

LLM-based generative text everywhere, in every app, could be useful. It could be integrated very straightforwardly at the OS level into all text controls, something MS is well-poised to do, and “Copilot” is a very good brand for that. (And I would be shocked if Apple and Google aren’t working on the same.)


Replying to myself to note that it looks like Google just started rolling this out in Chrome under the name "Help Me Write": https://support.google.com/chrome/answer/14582048?visit_id=6...


I sometimes use LLMs to write text for me simply so I can rewrite it myself, because having something to criticise is better than having to create from scratch.


Given the relative simplicity of integrating LLM autocomplete into OS text fields (compared to other integrations supported in modern OSes), I would be shocked if a state entity with balls (like the EU) doesn’t introduce laws requiring OSes to have open support for third-party LLM/generative-text integrations.

That said, that’s almost certainly Microsoft’s short-term strategy until then! And probably part of why Apple made their sudden priority shift towards LLM features.


The idea is to introduce advertising into the responses, subtly. Which will be extremely powerful.


In a few years, competitors to GPT-4 Turbo will exist, so this sounds like a rather foolish plan.


They don’t need to tweak them; the terms are already totally fucked. They are explicitly anticompetitive.


Wish there were a way to alter Copilot’s system prompt - if so, I could just cancel ChatGPT and use Copilot only.


And feed stuff into it via an API. I always like to automate stuff. I guess you could use Power Automate, but I'm not really big on the MS ecosystem (despite being part of a team that owns it at work).
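
Microsoft Copilot itself doesn't expose a public API or let you set its system prompt, but if the same model were reachable through something like an Azure OpenAI deployment, the automation would be a few lines. A rough sketch with the openai Python SDK (the endpoint and deployment name here are made-up placeholders, not anything Copilot actually offers):

    from openai import AzureOpenAI  # pip install "openai>=1.0"

    # Hypothetical Azure OpenAI deployment -- Microsoft Copilot itself
    # does not expose its system prompt or an endpoint like this.
    client = AzureOpenAI(
        api_key="YOUR_KEY",
        api_version="2024-02-01",
        azure_endpoint="https://your-resource.openai.azure.com",
    )

    resp = client.chat.completions.create(
        model="gpt-4-turbo",  # your deployment name
        messages=[
            {"role": "system", "content": "Answer tersely and cite sources."},
            {"role": "user", "content": "Summarise the attached meeting notes."},
        ],
    )
    print(resp.choices[0].message.content)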


These commercial offerings are always going to have mystery meat attached to them.

Just either accept that or reorient to self-hosting.
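
If you do go the self-hosted route, something like Ollama keeps the basic loop simple. A minimal sketch against its local HTTP API, assuming you've installed Ollama and already pulled a model (e.g. mistral):

    import json
    import urllib.request

    # Talk to a locally running Ollama server (default port 11434).
    # Assumes `ollama pull mistral` has been run beforehand.
    payload = {
        "model": "mistral",
        "messages": [{"role": "user", "content": "Explain 'mystery meat' in one sentence."}],
        "stream": False,
    }
    req = urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["message"]["content"])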


Hedge your bets and do both.


I just don't think the stability of cloud AI will allow doing both. There's still going to be so much tailoring that you have to pick a path.


Is this the same as the Copilot that is in VSCode via the Extensions?

It's possibly confusing, or just very simple. I'm not sure :-D


Microsoft is currently renaming all chatbots and direct LLM-backed applications "Copilots". It makes it very difficult to navigate their offerings.

But I think it is kinda similar to how OpenAI is trying to call everything backed by their models a "GPT"


> It makes it very difficult to navigate their offerings.

Microsoft has always done stuff like this. For instance, it was hard to know if you needed Windows Deluxe Professional Edition, Windows Master Programmer Developer Edition, or Windows Demigod Master of the Universe Edition (those might not be the exact names :-)

I suspect that the goal there was to confuse people enough that the wealthy bought the most expensive one just so they were sure they "got everything", while not losing the sale to the poor schlub who could only afford Windows Broke-ass Fool You Suck Edition.


I believe Microsoft reused the Copilot brand that it acquired from GitHub for its AI assistant which is embedded in Windows, Bing, and apparently now Office. GitHub Copilot also still exists but is a different product.


"Copilot" is fast becoming what "dot Net" was to their branding.


What a f$&cking headache that is.


> I believe Microsoft reused the Copilot brand that it acquired from GitHub

Microsoft didn’t acquire Copilot from GitHub; Microsoft introduced it at GitHub long after GitHub was part of Microsoft.


> Is this the same as the Copilot that is in VSCode via the Extensions?

This is about Microsoft Copilot – https://copilot.microsoft.com/ – not Github Copilot – https://github.com/features/copilot

There are a lot of “Copilot” extensions for VSCode, including Github Copilot, but none for Microsoft Copilot, AFAIK.

> It’s possibly confusing,

No. After GitHub Copilot had been around for a while, Microsoft rebranding every AI assistant as “Copilot” (with “Microsoft” attached in some contexts, but often just bare) is confusing, especially since lots of other projects adopted the “X Copilot” naming in the window between GitHub Copilot becoming popular and Microsoft introducing its new Copilot branding.


Given the ongoing issues with ChatGPT (slow and _incredibly_ lazy) and all the Copilot nonsense, as well as my growing aversion to how MS is muscling in on this space, I’m about to make a wholesale switch to Claude for my API and chat access.


I recommend Mistral because you can use it for AI development, while Anthropic explicitly forbids using its models to develop AI.


You get GPT-4 Turbo, Claude 3, and Mistral Large all through a Kagi Ultimate subscription, but it's still in beta.


The better version of Claude 3?





I wasn't aware there was a free tier of Copilot. From reading the article it sounds like an entirely different product than GitHub Copilot, right?


It used to be called Bing Chat (aka "Sydney"), if that rings a bell. It made headlines last year for its creepy behavior.

https://fortune.com/2023/02/21/bing-microsoft-sydney-chatgpt...


Yes, totally different. If you use Windows, there’s a pane in the Edge browser that opens up an AI chat window. You can use it for most of the things you get with a paid OpenAI subscription, but it’s free for every Windows user. They call it Copilot.


There is also a Copilot app that you can use from your phone.


The one named Bing? Or is there another one?

Edit: lazy question on my part - found the standalone app.


Haha, it’s standalone. I was too lazy to attach links, but for the next person stumbling over this:

iOS: https://apps.apple.com/se/app/microsoft-copilot/id6472538445... Android: https://play.google.com/store/apps/details?id=com.microsoft....


Not only on Windows. You can use it in Edge for Mac.


TIL!


It's also baked into Windows 11 (Insider build, maybe?) as a taskbar button.


The most recent major feature release has it now (I don’t know what they call those anymore); it showed up in my taskbar after an update maybe just over a month ago.


Yes. "Copilot" is just chatbot, now. The copilot that helps you code is now called "Github Copilot". I guess Microsoft couldn't leave a good name alone.




Foreboding of GPT-4.5 release?


I suspect it's got to do with excess availability of compute. Usage hasn't grown past the initial novelty wave, and enough enthusiast programmers have recently been talking about how Claude 3 performs better in their usage.


Competing with Gemini being rolled out into Google search.

It’s gonna be ruthless between Google, Apple and Microsoft.






