I think that the UI/UX conflation is inevitable, as developers should have skills in both areas, similar to how DevOps is becoming more and more popular (though it also has its opponents). I'd argue that being able to write your code without knowing how it runs isn't entirely viable, since one can influence the other - and the same goes for UI and UX. Both should be considered, even if I'd prioritize UX over UI.
> You are reducing this whole academic field that produces UX designers with the like of UI/visual/graphic designers who never studied or learned how to evaluate designs and study how people use designs (usability testing), which would TRIVIALLY catch and solve the issue in the OP of a severe usability issue with the V11 Tesla UI redesign.
Apart from bringing shortcomings in our educational systems to light (e.g. the pace at which the industry moves and the inability of many academic programs to keep up with it), I'm not sure I can agree with the entirety of the argument, because it reads almost like a "No true Scotsman" argument: https://en.wikipedia.org/wiki/No_true_Scotsman
Should more academic knowledge be applied in the industry? Sure! But since that's clearly not happening, it's useful to explore why: in my experience, much of the academic research and many of the ideas (model driven development in particular, or various approaches to testing software, come to mind) never find their way into the industry and don't get turned into viable products or even everyday practices. So I'd say it's not that this academic field is being maliciously reduced in that argument - it has simply failed to make itself relevant as a whole, even though it deserves that consideration. That's how we end up with these UX messes.
Sure, one can talk about outsourcing and how these consulting companies are responsible for the end results, but the blame also lies with the businesses that don't want usable software in 5 years but good-enough software in 1 year. Can you really blame them for trying to remain competitive and cutting out all of the non-essential factors just to be first to market and able to iterate quickly? I think I can, so perhaps it's useful not to blame some underpaid person under the pressure of made-up deadlines, but rather the company that chose to go down this path (both the client and the supplier), as well as the economic system as a whole. In my eyes it's not that dissimilar to how everyone wants to hire experienced devs, yet no one wants to invest in new devs.
The tone of your comment rubs me the wrong way, maybe because it seems like gatekeeping or elitism. Clearly, if the industry were better aligned with sustainability and a friendlier approach to HCI, these good patterns would emerge naturally: if a compiler can point out problems with code, plugins can do the same with code style, and Lighthouse can tell you about potential improvements to your page structure, why don't we have the same for accessibility and everything else?
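To make the comparison concrete, here's a deliberately minimal toy sketch of what an "accessibility compiler warning" could look like, using only Python's standard library. Real tools (Lighthouse's accessibility audit, axe-core, various linter plugins) are far more thorough; this one just flags `<img>` tags that have no `alt` attribute at all:

```python
# Toy accessibility check: warn about <img> tags that lack an alt
# attribute, the same way a compiler warns about problems in code.
# A minimal illustrative sketch, NOT a substitute for real audit tools.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.warnings = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the current tag
        if tag == "img" and "alt" not in dict(attrs):
            self.warnings.append(f"line {self.getpos()[0]}: <img> without alt text")

checker = AltTextChecker()
checker.feed('<p>Hello</p>\n<img src="a.png">\n<img src="b.png" alt="logo">')
print("\n".join(checker.warnings))  # prints: line 2: <img> without alt text
```

A check like this could run in CI next to the linter, so the warning shows up in the same place a compiler error would - which is the whole point.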
Just look at Flutter, something that was touted as a great advancement in interface design, yet it not only breaks screen readers (at least until recently) but also cripples basic browser functionality like right clicking: https://gallery.flutter.dev/#/
Look at "The Website Obesity Crisis", a trend that plagues the entire industry and seems like the front end equivalent of Wirth's law: https://idlewords.com/talks/website_obesity.htm (which also coins the term "Chickenshit Minimalism", as funny as it is)
I don't think the solution here is to point fingers at one another and claim that person X isn't a true developer because they didn't ignore the economic climate they're in to pursue creating the perfect UX, or that person Y doesn't fit the criteria either because they didn't rewrite their codebase in Rust with 100% code coverage, or because they didn't attend an academic institution for some arbitrary number of years.
As someone with a Master's Degree in CS with the added qualification of a Programming Engineer, I feel like a lot of what I learnt came from my own initiative and caring about it all, rather than from a formalized study course. And even so, I sometimes find myself making tradeoffs or sub-optimal choices (albeit well-described ones, clearly logged as technical debt) because I enjoy having food on the table, so to speak.
I think we should start with the realization that many industry trends are actively harmful, and that you can only work against them in a limited capacity, so you have to make the best of it: http://www.stilldrinking.org/programming-sucks
Disclaimer: I'm not downvoting your comment, because while I disagree with a part of your argument, I think it's worthy of discussion.
Thanks, those are all great points. I think I was trying to match the tone of the parent comment - sorry if it came off sounding arrogant or gatekeep-y, that was never my intention.
The elitism you sensed was just me struggling to articulate my frustration: having gone through the rigorous academic route, I see this industry plagued by graphic designers and 10-week UX bootcamp graduates who polish portfolios and get hired above junior levels, dragging down the field of UX design when they make deleterious design decisions in a user interface and are either uninterested in or too inexperienced to run proper usability testing or UX research methods to evaluate unintended/harmful designs.
“The market seems to want generalists, because they seem cost effective, and because most operations are too small to support a team of specialists,” says Steve Krug, a veteran usability specialist and author of the influential book Don’t Make Me Think. “But I think it’s pretty hard to be really good at more than one of the many subspecialties. It’s a conundrum.”
Understanding user mental models and behavior is a key skill of a great UX designer; it lets you illuminate where harms exist in a design, harms that can be mitigated when the design matches the mental models of a representative sample of the user population. Having this skillset and design practice would have easily caught the issue presented in the OP regarding the confusing button scheme in the Tesla V11 UI update.
The fact that it was pushed out to Tesla drivers without nuanced user testing by UX professionals (UX researchers, if we break it down into specializations) says a lot about the design team at Tesla, or the lack thereof. All this has nothing to do with the developers who simply take these designs and implement them, and who are typically overworked and focused on delivering features driven by the success of measured KPIs.
> The elitism you sensed was just me struggling to articulate my frustration: having gone through the rigorous academic route, I see this industry plagued by graphic designers and 10-week UX bootcamp graduates who polish portfolios and get hired above junior levels, dragging down the field of UX design when they make deleterious design decisions in a user interface and are either uninterested in or too inexperienced to run proper usability testing or UX research methods to evaluate unintended/harmful designs.
You know, I can fully understand why someone would be frustrated with these things! Yet, at the same time, getting a degree didn't prepare me for the realities of working in the industry either - that took further years of work. Perhaps it's a bit like someone expecting to learn C++ in 2 months, whereas in reality really getting to know it might take anywhere from 2 years to 2 decades (depending on what you actually want to do, be it writing your own compiler, a physics/game engine, or some low level piece of software that should be bulletproof, versus just a package or two for your own needs).
The first step to getting good at something is being bad at it - personally, I really appreciated being able to work with microservices in an academic setting and learn about the many ways they can go wrong, but even now, in my day job, I am still learning a lot, albeit it takes a lot of care to limit the fallout of mistakes. In this industry, new technologies and methods just never seem to run out, so it's a constant process of learning and churn, sometimes without good reason.
I do agree that figuring out someone's seniority and what they should be entrusted with is difficult and often nebulous, but perhaps that's just because of the rapid pace of this industry and how much of a "Wild West" it is at the moment.
> “The market seems to want generalists, because they seem cost effective, and because most operations are too small to support a team of specialists,” says Steve Krug, a veteran usability specialist and author of the influential book Don’t Make Me Think. “But I think it’s pretty hard to be really good at more than one of the many subspecialties. It’s a conundrum.”
With this, I might have to concede. I still think that having inter-disciplinary engineers who are competent at everything, even if not brilliant at any one particular thing, is probably a good idea, but there is definitely a good argument to be made for having specialists. Yet the financial realities of our world will often force our hand in one direction or the other.
> The fact that it was pushed out to Tesla drivers without nuanced user testing by UX professionals (UX researchers, if we break it down into specializations) says a lot about the design team at Tesla, or the lack thereof. All this has nothing to do with the developers who simply take these designs and implement them, and who are typically overworked and focused on delivering features driven by the success of measured KPIs.
Partially agreed! There are usually "known unknowns" (e.g. "we don't know how this piece of code will interact with that other piece, so we should probably set up automated tests to catch them diverging over time") and "unknown unknowns" (e.g. "we probably should have been aware of the UX impact of these changes, which totally escaped our consideration"). Maybe people knew about the UX impact but didn't want to speak up or be contrarian in that particular environment. Maybe they knew but just didn't care much, since no one goes to jail for shipping bad UX (at least in the automotive industry these days; it would be a different situation in the aerospace industry). Or maybe no one even considered it, for a variety of reasons.
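For the "known unknowns" case, the guard I have in mind can be tiny. As a hypothetical example (the names and the wire format here are made up for illustration), a round-trip test pins down the contract between two pieces of code that get edited independently and could silently diverge:

```python
# Two pieces of code that must stay in sync: whoever edits one
# without the other breaks the round-trip test below immediately.
def encode(record: dict) -> str:
    # hypothetical wire format: "k=v" pairs joined by ";"
    return ";".join(f"{k}={v}" for k, v in sorted(record.items()))

def decode(text: str) -> dict:
    return dict(pair.split("=", 1) for pair in text.split(";"))

def test_round_trip():
    record = {"id": "42", "name": "v11-ui"}
    # catches divergence the moment encode/decode stop agreeing
    assert decode(encode(record)) == record

test_round_trip()
```

The test encodes a "known unknown" as an executable check, which is exactly the kind of thing that a UX change like the one in the OP never got.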
I guess we'll never know, but I agree that it's probably telling of what the priorities were, perhaps a reflection of the greater trends in our industry. I have yet to see the likes of ADA (https://www.ada.gov/pcatoolkit/chap5toolkit.htm) compliance ever mentioned as a concern in any of the commercial projects I've worked on, unless it was explicitly demanded in the design spec. But talking about which front end framework or component library to use? Endless bike-shedding: https://en.wikipedia.org/wiki/Law_of_triviality
Great discussion! I think we can both agree that, in the end, when people's lives are at stake, there should be more scrutiny of both design and code across the board. Hopefully in the future there will be a consortium in which experts from each respective government body, e.g. the Department of Transportation, meet with tech industry experts from the likes of the ACM Special Interest Groups[1] to create a regulatory framework for releasing safe and responsible tech innovations into transportation and other emerging spaces.