
I am just curious. Please explain it to me.

1. Who are the beginners? These concepts are already apparent to most grad students and to anyone following this scene closely, yet many of them can't find a job related to it. Does that make them beginners?

2. These are such generic use cases that they don't define anything. It is literally software engineering wrapped around an API. What benefit does the "beginner" get?

3. So is this aimed at exceptionally talented people who want to reboot their career as a "GenAI" X (X = engineer/researcher/scientist)?

4. If the only open positions in "generative AI" require a PhD, why do materials such as this exist? Who are they targeted at?

5. Most wrapper applications have a short lifespan. Does it even make sense to go through this?

6. What does it mean for someone already entrenched in the field? How do they differentiate themselves from these "beginners"?

7. What is the point of all this when it will become irrelevant in the next 2 years?



I don't think this course is for machine learning grad students. I think Microsoft is trying to create materials for someone interested in using ML/AI as part of developing an application or service.

I've only skimmed the course here, but I do think there's a need for other developers to understand AI tooling, just as there became a need for developers to understand cloud services.

I support those building with any technology taking the time to understand the current landscape of options and develop a high-level mental model of how it all works. I'll never build my own database engine, but I feel my learnings about how databases work under the hood have been worth the investment.


I've been finding the recently coined term "AI engineer" useful, as a role that's different from machine learning engineering and AI research.

AI engineers build things on top of AI models such as LLMs. They don't train new models, and they don't need a PhD.

It's still a discipline with a surprising amount of depth to it. Knowing how best to apply LLMs isn't nearly as straightforward as some people assume.
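To make that concrete, here's a minimal sketch of the kind of glue an AI engineer writes on top of a model: a prompt template, a model call, and defensive parsing of the reply. The model call is stubbed out here with a canned response; in practice it would be an HTTP request to a hosted model (e.g. via OpenAI's API). The prompt wording and the `extract` helper are my own illustrations, not from the course.

```python
import json

# Hypothetical task: pull structured data out of free text with an LLM.
PROMPT = (
    "Extract the product name and sentiment (positive/negative) from the "
    "review below. Reply with JSON only.\n\nReview: {review}"
)

def call_model(prompt: str) -> str:
    # Stub standing in for a real API call to a hosted LLM.
    return '{"product": "headphones", "sentiment": "positive"}'

def extract(review: str) -> dict:
    reply = call_model(PROMPT.format(review=review))
    try:
        return json.loads(reply)
    except json.JSONDecodeError:
        # LLMs often wrap JSON in prose or markdown fences; a real
        # version would strip those or retry before giving up.
        return {"error": "unparseable reply", "raw": reply}

result = extract("Loved these headphones, great bass!")
print(result["sentiment"])  # prints positive
```

Most of the "depth" lives in the parts this sketch hand-waves: prompt iteration, retries, output validation, and evaluation.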

I wrote a bit about what AI engineer means here: https://simonwillison.net/2023/Oct/17/open-questions/


So, in a similar vein to data engineers: people who USE things like Redshift/Snowflake/Spark/etc., but who are distinct from the category of people who actually build those underlying frameworks or databases?

In some sense, the expansion of data engineering into a discipline unto itself was largely enabled by the commoditization of cloud data warehouses and the open source tooling supporting the function. Likewise, the more foundational AI that gets created and eventually commoditized, the more an additional layer of "AI engineers" can build on top of those tools and apply them to real-world business problems (many of which are unsexy... I wonder what the "AI engineer" equivalent unit of work will be, compared to the standard "load these CSVs into a data warehouse" base unit task of data engineers).
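For anyone unfamiliar with that base unit of work, here's a toy version: load a CSV into a warehouse table. SQLite stands in for a real warehouse like Redshift or Snowflake, and the file contents and `orders` schema are made up for illustration.

```python
import csv
import io
import sqlite3

# In-memory stand-ins for a real CSV file and a real warehouse.
csv_data = io.StringIO("id,amount\n1,9.99\n2,4.50\n")
conn = sqlite3.connect(":memory:")

conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")

# Parse rows and cast columns to the table's types before loading.
rows = [(int(r["id"]), float(r["amount"])) for r in csv.DictReader(csv_data)]
conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
conn.commit()

count, = conn.execute("SELECT COUNT(*) FROM orders").fetchone()
print(count)  # prints 2
```

The real job, of course, is everything around this loop: schemas that drift, files that arrive late, and loads that have to be idempotent.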


* Fine tune this prompt/prompt chain for less bias.

* Fine tune this prompt/prompt chain to suggest X instead of Y.

* A/B test and show the summarized results of implementing this LoRA that our Data Engineer trained against our current LLM implementation.

* A/B test and show the summarized results of specific quantization levels on specific steps of our LLM chain.

All of which requires common sense, basic statistics, and patience rather than heavy ML knowledge.
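The "basic statistics" part of the A/B tests above can be as simple as a two-proportion z-test on how often each variant is rated good. This is a minimal sketch using only the standard library; the counts are made-up illustrative numbers, not real results.

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """z statistic for the difference between two success rates."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled proportion under the null hypothesis of no difference.
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical A/B results: prompt variant A rated "good" 180/300 times,
# variant B 150/300 times.
z = two_proportion_z(180, 300, 150, 300)
print(f"z = {z:.2f}")  # prints z = 2.46; |z| > 1.96 → significant at 5%
```

Nothing here requires ML expertise, which is the point: the hard part is designing an honest evaluation, not the math.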


It seems to me that this course introduces Python devs to building generative text applications using OpenAI's models on Azure. And I don't mind it; some folks will find it useful.


The point is to hook people who want to “do AI” into Microsoft’s cloud API ecosystem.


1. Seems like regular software devs who want to try making AI stuff.

2-6 seem like leading questions, so I'll skip them, but:

7. Because you can make fun stuff in the meantime!


You give it to an intern and report to the higher-ups that "generative AI" is now used in your company. The higher-ups tell their friends while golfing. Everyone is happy, until their entire industry gets disrupted by actual AI specialists.


I'm not entirely sure that all GenAI positions are for people with PhDs. Nick Camarata is a researcher at OpenAI and doesn't appear to even have a BSc.


In those 2 years of head start you can acquire users and collect excellent data that will make your AI app better than the competition.



