I asked my students to turn in their cell phones and write about it (technologyreview.com)
120 points by r0n0j0y on Dec 27, 2019 | 111 comments


I don't understand why he decided to do his experiment after the students told him it was because they didn't understand the words in the book.

But anyway, I think some of this is true: things like social media have gone too far in attracting our attention. But a lot of the blame also lies with the structure of education. Listening to one person for at least an hour, sometimes with a hundred other people in the room, is not a good way of learning, at least not for everyone.

Also learning to remember to complete a closed book exam is not a useful skill anymore. One of the good things about phones is that almost any information can be found almost instantly almost anywhere. No need to remember formulas or dates. Instead, knowing how to find information, verify it, and use it is far more important.


> One of the good things about phones is that almost any information can be found almost instantly almost anywhere.

If you need to constantly look up information, it means you have no long-term knowledge of anything.

Hence you can't verify that piece of information.

> Also learning to remember to complete a closed book exam is not a useful skill anymore

Your long-term knowledge, AKA "your expertise", kindly disagrees.

You can't look up information when you are at a meeting with the client. You should already know what you're talking about and know the answers to the most probable questions.

Prior knowledge gives you confidence; without building a "memory only" store of information you can't be sure of anything.

What do you do if you disagree? Engage in a google battle? Or, even worse, use Wikipedia as a baseline for the truth?


> If you need to constantly look up information, it means you have no long-term knowledge of anything.

I don’t think this is a very charitable interpretation of what GP was trying to say. Of course you can’t get by being a blind conduit of Google’d/Wikipedia’d/etc information, but you certainly don’t need to (and probably can’t) cram everything you need into your memory for instant recall on demand.

How about a middle ground: memory can be thought of more like an LFU cache, where “use” is defined as reliance on explicit details of a concept. For example, I rely daily on programming language syntax and best practices and therefore have them deeply embedded in my mental cache. Other knowledge, however, like sorting algorithm implementations, I rarely utilize and probably won’t remember after the next time I’m quizzed on it in e.g. an interview or a conversation with someone where I’m trying to sound smart (cache miss).

In many cases where I suspect that I’ll frequently encounter cache misses with a particular piece of knowledge, I often find it better to just cache high-level details (useful properties of sorting algorithms) and Google the missing details. That, or work with subject matter experts.
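
To make the analogy concrete, here's a toy LFU cache in Python -- a rough sketch only, with made-up class name, capacity, and keys purely for illustration (not from the thread or any particular library): frequently "used" knowledge stays resident, rarely used details get evicted and have to be looked up again.

  from collections import defaultdict

  class LFUCache:
      """Toy LFU cache: when full, evict whatever has been used least often."""

      def __init__(self, capacity):
          self.capacity = capacity
          self.values = {}                # key -> value
          self.uses = defaultdict(int)    # key -> how often it has been relied on

      def get(self, key):
          if key not in self.values:
              return None                 # cache miss: time to Google / ask an expert
          self.uses[key] += 1             # every use reinforces the entry
          return self.values[key]

      def put(self, key, value):
          if key not in self.values and len(self.values) >= self.capacity:
              victim = min(self.uses, key=self.uses.get)   # least-frequently-used key
              del self.values[victim], self.uses[victim]
          self.values[key] = value
          self.uses[key] += 1

  # Daily-use knowledge stays hot; rarely used details eventually get evicted.
  memory = LFUCache(capacity=2)
  memory.put("language syntax", "relied on every day")
  memory.put("sorting algorithm details", "skimmed once before an interview")
  for _ in range(5):
      memory.get("language syntax")
  memory.put("new framework quirks", "learned this week")  # evicts the sorting details
  print(memory.get("sorting algorithm details"))           # None -> look it up again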


> You can't look up information when you are at a meeting with the client.

This sounds like an uncomfortable kind of meeting. I often find myself saying "we can easily look that up".

Knowing where and how to effectively look things up is a critical aspect of knowledge.


> Knowing where and how to effectively look things up is a critical aspect of knowledge.

It depends on what you need to look up: I don't need to know that the Eiffel Tower is held together by 2.5 million nuts, bolts and rivets (I looked this one up), but as a Western European, I should know that the Eiffel Tower is made from some kind of iron.

OTOH, if I'm the company in charge of maintaining the Tower, I should probably know more about it. So it really depends.


To expand on this: it's not just about verbatim memorizing discrete facts - algorithms, definitions, dates, events, vocabularies, grammar rules, musical notation,... - it's about learning when and how to use or apply them.

Facts become information, information becomes knowledge. Learning is about processing what's in front of you. Becoming aware of facts presented to you in a book or a powerpoint presentation, then reflecting on those facts, turning those over, engaging with them, asking new questions, trying to apply them in different contexts and so on.

As you do that, the entire experience of gaining knowledge also leads to self reflection. That is, learning to know yourself. And others around you. When does it matter to apply what you know? When not? What is relevant to me and to those around me at which point? This is about learning to formulate effective critical argumentation based on what you know. And that involves challenging viewpoints including your own. Context matters.

The latter is a skill. It's something you need to practice if you hope to be any good at it.

Expanding on your example:

I think it's important for engineers to have a proper understanding of the details about nuts and bolts when they are actually building the Eiffel Tower. But in a meeting with stakeholders - the mayor, city officials, banks, sponsors,... - it's also about knowing that they probably don't care about those material aspects. Unless the context changes and those become super important. Like when it becomes apparent that the iron wears out because of changing weather conditions. And even at that point, those stakeholders likely aren't interested in the exact chemical formulae that explain the process unless it's necessary to cover risks and liabilities. They simply want a good, terse explanation from an expert and an assurance that it can be solved.

Hence why knowing things isn't enough. It's about knowing how and when to apply them effectively.


Yes, being able to offload less relevant, less frequently accessed, and intricate information is very useful.

However, the reason why it is so useful is because it gives you more room to use for the stuff that really matters.

Knowing stuff has much lower latency than looking stuff up, and the ability to quickly correlate multiple relevant facts and draw connections allows for much higher capabilities and productivity.

The key is figuring out which stuff you need to know to be able to quickly build and evaluate mental models, and which stuff is best looked up.


If you don't have enough knowledge and subsequent intuition, you won't realize that "fact" needs to be checked. You will be easy to manipulate.

In practice, you see that with non-technical managers all the time. You see it when people just throw around smart-sounding soundbites about history, if you happen to know the period well.

Many charismatic, confident people rely on others not checking what they talk about.


intuition != memorized formulas


Intuition = knowing enough to see that "something" does not work out. Knowing enough facts to see that something is inconsistent with them.


If I were in a meeting with you about what you were doing or could do for me or my company, and you consistently replied with 'we can easily look that up', I would cease to work with you - I can hire other people who clearly don't know my business or what they are doing for cheaper.


OTOH, if “we can easily look that up” is never said, it’s likely you’re not asking probing questions. Or that the other side is probably more confident than they should be.


If you're in a meeting where no party is challenged by what is being said, then what is the point of having that meeting?


> I often find myself saying "we can easily look that up".

Maybe I need to clarify: with "you can't look up information" I meant that a meeting (or an exam) is not the place and moment where you verify whether a piece of information is valid or learn something completely new; it's the place and moment where you put what you already know about something to work.

Knowledge is mainly about recognising patterns. If you know something can be solved with a certain formula, you can look it up; it's not easy to keep everything in mind, but you have to already know what to look up (at least the general domain of the problem).

In my experience, if I attend a meeting about (for example) implementing the new company SSO component, I go there assuming we all know what an SSO component is, what it does, and that each participant understands the problem from their angle (business, technical, legal, etc.).


Facts are freely accessible these days compared to the past. A large part of Education in the past wasn't designed with that in mind.

So Education IS changing to meet that reality, but it's still a work in progress. Consensus is still developing on what the best route ahead is.

You are right that expertise is only possible through the accumulation and quick recall of facts. That said, the time it takes for an expert in Physics to become an expert Programmer, or vice versa, is much more than the time it takes for an expert Physicist to find and collaborate with an expert Programmer. So in a networked society, Education's focus is shifting to collaboration and healthy communication at an early age, rather than producing mediocre experts in all subjects, which is what past education systems have produced in great volumes.


Also you can't Google what you don't know you don't know.


I think you are conflating the use-cases of an exam with how you are tested in reality.

imo an exam is much more akin to meeting preparation than the meeting itself. I completely agree that you should know what you are talking about in a meeting, but the reason you know what you are talking about is because you spent time beforehand working out the specifics for the expected problems. Everyone knows meetings require preparation time to make sure the things being talked about are verified and accurate.

I actually wouldn't really want to talk to someone who's going to try to do everything off the top of their head unless I'm convinced they're a genius. If something wasn't already prepared, you don't half-ass it in the meeting. It becomes an action item to be taken care of after the meeting. It's a waste of everyone's time to have anyone working out a problem live.


>>> imo an exam is much more akin to meeting preparation than the meeting itself.

umm, this is new... why don't you think the exam is "the meeting itself"?


The exam has no intrinsic value. It’s just there as part of the learning process. Most classes you can see a division between people who see grades as an objective (will this be on the test?) and people who see grades as a metric for another objective.

If you see grades as an objective, then the exam is something to study for. If you see grades as a metric, then the exam is there to measure progress and guide your study.


You do not know in advance what problems you need to solve on the test. You know the topics, sure, but you show up to the exam to perform work.

In a well-designed meeting with a client, you show up to present work already done - or to discuss what work needs to be done. Internal meetings might be more ad-hoc, and you can get brownie points for working off the top of your head, but it's not necessary.


> I don't understand why he decided to do his experiment after the students told him it was because they didn't understand the words in the book.

I assumed that the student said this because they were blind to the fact that their poor performance was due to inattention.


That’s like saying that calculators eliminate the need to learn arithmetic — the reality is that fluency enables you to do more.

Knowing things is more important than ever. Googling crap on your phone is also too slow and too failure prone.


Hey I can just look up a simplex noise generator on Google! Oh you need me to develop a noise generating algorithm from scratch to meet a certain set of requirements? You better hire someone with a proper education then.


> Also learning to remember to complete a closed book exam is not a useful skill anymore.

Looking up stuff is no good if you can't integrate it with a pre-existing scaffolding of concepts and knowledge. That pre-existing knowledge is essential to being able to make use of the search results.


> learning to remember to complete a closed book exam is not a useful skill anymore.

Unless you need to pass a whiteboard job interview that depends on remembering some obscure algorithm. ;)


> Unless you need to pass a whiteboard job interview

No one _needs_ to pass those kinds of interviews. There are plenty of jobs that don't require that at all. But if you want to, it is sure handy to be able to remember obscure algorithms.


> Also learning to remember to complete a closed book exam is not a useful skill anymore.

The exam is going to cover some well-tread parts of whatever subject you are studying. For most reasonable exam questions, especially at undergraduate level, you can find some answer online without having to do the reasoning yourself as long as you understand the terms. However, I would like to see undergraduates prepared to answer questions which are more obscure or unusual, and being able to reason through these questions with your knowledge of the field is a critical skill to make this possible.

Additionally, you should be spending some of your time memorizing things. This is just necessary to have some kind of fluent conversation with other people studying the same subject.

My personal preference for most subjects is a combination of open-book and closed-book. It’s hard to design the closed-book portion so that it doesn’t come down to a memorization game. The solution I usually prefer is to create a larger test and allow students to choose a subset of the test to answer, but this creates a larger burden on the graders and the instructor creating the curriculum.


It worked fine 20 years ago.


The author is best known for complaining about modern education and having a low opinion of his students. The fact that he would respond to students not understanding the book he assigned by telling them their phones are the problem is unsurprising.

Here is a pretty good summary of his MO which apparently hasn't changed much since a few years ago. https://hookandeye.ca/2016/03/23/the-unbearable-privilege-of...


ahh, I did think that first student "quote" seemed made up by someone who doesn't listen to students.


Indeed, most of my education was without (smart)phones, and teachers rarely had my raw undivided attention. I was quite frequently working on more interesting stuff (hobbies, math, stories), daydreaming, etc. Students are humans, and they have preferences -- and that's an excellent sign (that creates differentiation, that makes students more cognitively flexible, more adaptive). My preferences didn't align perfectly with the curriculum, and I was free to explore as I wanted (without disturbing other students, of course), as long as I did my duty of learning enough to pass the course. In the worst case I've always had the refuge of my mind.

To demand sole attention unconditionally and continuously is akin to thought policing.

That said, there are certainly perils with too much distraction and smartphone usage. I'm a firm believer in human instinct guiding toward positive usage of technology, but there are unquestionably failings in our instincts -- there have been for quite a while.

Television was a big one for me -- it's immensely more attractive than a book; but books can be much more gratifying and educational once you get started; it's just more work. Yet there was always a lot of value added by judicious usage of television -- documentaries, non-vacuous talk shows, good film, etc.

Online games too -- I've spent a good chunk of my childhood in front of a PC playing an MMORPG. Seems like a waste of time (and some of it was), but I've experienced a world of politics, fierce competition, intense cooperation, trading and all sorts of economic endeavors, friendship, cruelty and kindness in the safety of my home, with people of diverse backgrounds and cultures; I learned English and became extremely proficient almost entirely through the game. It also ran the risk of getting me addicted and isolating me entirely from real-life social interactions; that risk was averted (and I would certainly still have had a dearth of social interaction due to family problems) -- mostly through my own good judgement but also thanks to pleas of caution from my parents and siblings. I still remember events, lessons and friends from those days.

Surprise: we'll have to keep dealing with the ambivalence of technology.

---

"The one solid fact is that the difficulties are due to an evolution that, while useful and constructive, is also dangerous. Can we produce the required adjustments with the necessary speed? The most hopeful answer is that the human species has been subjected to similar tests before and seems to have a congenital ability to come through, after varying amounts of trouble. To ask in advance for a complete recipe would be unreason­able. We can specify only the human qualities required: patience, flexibility, intelligence."

From 1955: Can we survive technology?

http://geosci.uchicago.edu/~kite/doc/von_Neumann_1955.pdf

"What kind of action does this situation call for? What-ever one feels inclined to do, one decisive trait must be considered: the very techniques that create the dangers and the instabilities are in themselves useful, or closely related to the useful. In fact, the more useful they could be, the more unstabilizing their effects can also be. It is not a particular perverse destructiveness of one particular invention that creates danger. Technological power, technological efficiency as such, is an ambivalent achievement. Its danger is intrinsic. "


He's also one of those people who fetishizes the humanities. See his article "If I Didn't Laugh I'd Cry: An Essay on Happiness, Productivity, and the Death of Humanities Education"

> Productivity has become the raison d'etre of Western capitalist societies, supported by its fundamental principles, quantity and impact. Both of these principles have taken hold in universities, and both place "useful" applications of knowledge above exploration of the human condition, above doubting, questioning, and wondering. The predominance of this ethic has forced humanities faculties into the awkward position of either repackaging their offerings so as to support the productive ethic or insisting on their integrity and facing charges of irrelevance. Though the strategy suggests choice, both options lead to the same end: the elimination of genuine humanities education. No wonder humanities professors are unhappy. If we are going to learn once again what a genuine and robust education in the humanities is about, we're going to have to explore that strange thing on which humanities education ultimately rests--our humanity

I find such essays to be bland to the point of absurdity. Humanities advocates repeatedly claim that "only the humanities" provide certain skills like "critical thinking" without really presenting any evidence for the point. In my (admittedly anecdotal) experience, engineers took classes outside of engineering and found them thought-provoking but lacking in rigour, while humanities majors simply found engineering classes too hard. My advice for those seeking a well-rounded education is simple: study engineering and take electives in the humanities. Don't major in the humanities, because you'll surround yourself with people who look down on the hard sciences, and end up with a distinctively not well-rounded education.


While I take a lot of issues with the article itself, I concur with the core message completely, having done the experiment myself.

Two months ago I gave up my phone for a week, for a class on ethics, along with a single other student.

Not being able to listen to podcasts, audiobooks, and music everywhere I go, I fell into so many more conversations than I normally would in a week.

What I realized was that, while I didn't feel addicted to my phone, I definitely was missing out on a lot of unique interactions by wearing headphones everywhere I went.


> I definitely was missing out on a lot of unique interactions by wearing headphones everywhere I went.

As someone who works in a big city, this is why I wear my headphones everywhere. I am usually not even listening to anything, I just don't want random people to engage me.


That seems quite irrational.

I don't wear headphones in public very often, and it's rare that someone wants to interact with me. Usually it's just to ask where some street is, and it actually feels quite good if you are able to help someone.


I appreciate where you're coming from and wish that were usually the case, but where I live 99/100 public interactions are to guilt you out of your money one way or another. Everyone's got to make money somehow so I respect their right to ask, but if I'm not in the mood to give away cash why waste their time by acknowledging them?

I've never had to beg for money, but imagine you _really_ need a few bucks ASAP. You get someone's attention and they start talking about their problems, offer you some half eaten food, etc but ultimately refuse to give you some money. I would rather these good-hearted but cheap folks ignore me so I can talk to people who are in the giving spirit.


> While I take a lot of issues with the article itself, I concur with the core message completely, having done the experiment myself.

This has been my experience in teaching as well: https://jakeseliger.com/2008/12/28/laptops-students-distract...


My experience with education is that you are often forced to write in order "to please" the teacher, rather than express yourself (if you want the proper degree). So personally I do not really believe in any of those numbers.

Edit: At least not at the academic level; there I think it looks a bit different.


My biggest peeve is the length requirements with writing. There's no need for fluff unless you're going into literature. People need to learn how to succinctly get their point across.

I remember we'd have reports where we needed to write 10 pages and I would finish everything in 4 pages and have no idea what to do after that, and then it just becomes a game of bullshitting to draw it out.

Same thing with essays that need to be a page long when I'm able to answer in a paragraph.


I have actually repeatedly heard the same complaint from professors. They don't want to read a bunch of 20 page papers when the standard in the field for (technical) publications is like 4. So they put maximum page limits on, which makes vastly more sense all around.

Maybe in philosophy it's necessary because the standard media is books or long essays so making sure you can play the "game of bullshitting" is a useful vocational skill in that field, to make your text long enough for people to buy your book.

No offense intended to philosophers, I love philosophy. But practical considerations like earning a living from a philosophy degree are perhaps different. It's kind of hard to sum up a new philosophical treatise in four pages since you'd probably take that long just to define your version of what "good" means.


My philosophy papers always had the most strict page limits. Try arguing against Nozick in 3 double spaced pages...


You should be able to make that case to your prof. You would probably get points for it :)

Some profs are subject matter experts who have no actual understanding of Pedagogy. So they pick up whatever rules someone else tells them to use, and then with time figure out what works and doesn't from feedback (which students are always very hesitant to give).


We had informal requirements (or better, "expectations") for the length of a PhD thesis (physics). It was about 200 pages.

Mine was a whopping 40 pages. The introduction to the subject was two sentences, plus the observation that if you need an introduction to the matter you should probably not be reading this thesis, which deals with a super specialized area.

One of the reviewers refused to read it on the basis of its size. The four others looked relieved not to have to go through the history of physics again.

I hope that things are changing in this age of blogs and condensed information; the French idea of a thesis made of published articles and a few pages of glue is great.


One of the best and most challenging classes I took in university was a course on Old/Middle English lit where the professor required us to write a paper for each class (two classes per week) with a one page _limit_. It was fun and really hard.


The point of an essay is arguments and making the reader believe you. Of course you can answer the question in one paragraph, but that is only part of the point. Another part is to verbalise multiple arguments that support and counter your position.


Right, so "write an essay following X structure that's Y paragraphs long as defined by the structure" (eg three arguments for and one countered argument against, plus an intro and conclusion) is fine. "The paper must be at least 6 pages long" is not. I'd much rather read and write "this uses Mouse A rather than B because A was much cheaper for similar functionality" instead of "the reason for which the first discussed mouse, Mouse A, was chosen to be used in this project instead of the alternative, Mouse B, was due to a cost-benefit analysis. The first mouse performed various tasks at near or surpassing ability when compared to the outputs of Mouse B; with this, and the significant price discrepancy heavily in favor of Mouse A, it is clear that the minor detriments in functionality shown by Mouse A are easily outweighed by the the more potent upside of being notably less expensive than Mouse B."

You may notice that the second example is TERRIBLE writing: too many modifiers, uncomfortable sentence structure, repetition of full names instead of pronouns, uses and defines "cost benefit analysis" instead of... not doing that. But it's longer! Easier to fit that length minimum! If you intend to require a depth of argument or a number of pros and cons, say you'll grade on that.


1.) I have never seen an essay about mouse selection. While possible, that would be purely an exercise in structure. Structure exercises exist, but again, rarely about a topic like mouse selection. Just a bad match for the format.

2.) The style of writing in your second example would not get you a good grade in a writing course. Simple as that: while long enough, it is not good enough.

Depth of argument is not contradictory to good sentence structure. Or to good overall structure. The essay is not actually graded on the smartness of the arguments, but on whether you present them well.

And yes, writing courses focus on the writing elements of a text. Just like a programming assignment focuses on coding and less on your work being useful.

----------

What you did there is try to hack the writing assignment. It typically does not work; teachers are typically not that dumb. They would, however, simply call that "bad writing" or "did not put in effort" rather than the flattering "hacking the text".


This can definitely play a role. Over the past several semesters all I've been doing is writing whatever the teacher wants to hear: their interests, political leanings, ideologies, everything.

Why? Because it works and because I'm trying to "hack" the system, in PG's words.

Back to the point, I think some students may very well just write whatever they think the teacher wants to hear, and compounded with those extra credit points, you get selection bias in and of itself. The teacher may have a point, but maybe take it with a grain of salt.

As a teen I see a ton of people using their phones constantly, and I decided to go cold turkey after my junior year (no social media except Twitter, to stay up to date with ML research, and HN for cool articles). Now instead of spending time addicted to whatever everyone else is, I spend a ton more time coding and learning cool stuff.


Going to school and college in India, I always thought it would be better in the west, where I assumed teachers would, on average, be more open minded towards different opinions from students.

Cause bias against opinions expressed in writing assignments was a concern I had too.


For me it depended heavily on the professor. Some marked me down almost assuredly because I argued against their preferences. Then again, one professor would keep running arguments in their grading of my papers just to end it with "Disagreed with everything you wrote, excellent arguments. A"


I just avoided classes that had writing assignments because I was tired of having to ape the teacher's opinions.


I have never really felt compelled to "ape" a teacher's opinions.

Demonstrate a sufficient and working understanding of? Yes; which I guess could be considered aping. However I can't recall a time I've been outright penalized for challenging a teacher's take as long as some greater mistake in making my case was not at the same time undertaken. For instance, the misapplication of logical device, or tortured cherrypicking of data/context.

I've never professed to understand why many people view academia as an institution in which the end goal is to agree with the professor for the grade. I've always treated it as an institution for the search for answers to questions where the searching for good questions is arguably more important and more difficult than finding the answers. I can't count the number of times I've felt dejected because I've had an armload of data or observations that are somehow related, and I've not been able to do anything but stew in angst because while there was some meaning to it, I could never figure out how to formulate the question to which the observations were the answer.

Then again, I stuck around long enough for my Bachelor's, then got out. I might not have gotten to the point where there is a much higher stake in tipping the Professor's Sacred Cow.


> I've never professed to understand why many people view academia as an institution in which the end goal is to agree with the professor for the grade.

I didn't believe it either until I experienced it personally.


I often had opinions very unlike those of my ethics teacher and she gave me decent grades anyways as long as I made reasoned arguments.

Depends on the subject really. It's probably not a good idea to try any stunts in history etc.


You might enjoy this song, "How I Failed Ethics" by The Magnetic Fields.

https://youtu.be/Hu5dEXZ7DOY

  "Though majoring in Visual and Environmental Studies
  And minoring in History of Sci
  I had to retake Ethics from my Mennonite professor
  For whom my skepticism didn't fly.

  "The first time I made mincemeat of the standard propositions
  Establishing a so-called moral science
  And I declared morality an offshoot of aesthetics
  And got a failing C for my defiance."


It sounded like the papers were ungraded. Which is something I encountered a few times in philosophy seminars and really appreciate.


I had a class or two taught by a teacher like that, but the majority of the time this wasn't the case.


Today, students should write "to please" future employers and social credit rating algorithms.


> My experience with education is that you are often forced to write in order "to please" the teacher

It, honestly, doesn't sound true.

Teachers have egos, yes, but they can also recognise value in a student's work, even if it doesn't please them.

Of course the wrong answer is wrong, even if the student put a lot of effort in it.


Many students interpret bad grades in writing assignments as bias on the teachers’ part. That can happen, but more often the position or argument expressed isn’t necessarily derived from knowledge.

Aping the teacher/professor is the fast path, not only for the purposes of being agreeable, but because the argument is well thought out at some level. Part of the educational process is being able to make an argument that you don’t agree with.


Humans are able to fill in the blanks on arguments that they already understand and do so implicitly without even noticing. So, if you make an argument the professor agrees with and is familiar with, you can skip steps, and the professor will still grok it. If your argument is novel or unfamiliar, missing steps appear as logical leaps, even though those same missing steps appear as extraneous detail to somebody already familiar with the argument.


> If your argument is novel or unfamiliar

... you probably are a genius!

Students usually talk about things they don't know, understand or can relate to (yet).

Or, as Alan Kay put it

> Socrates didn't charge for "education" because when you are in business, the "customer starts to become right". Whereas in education, the customer is generally "not right"


Having foregone nearly all packaged food and all flying for about five years, I can report similar results -- discovering that what was supposed to be more convenient, bringing the world closer, connecting me to family, and exposing me to more culture did the opposite.

Next to nobody I've shared my experiences with considers going without packaged food and flying possible, let alone my results, but experience speaks louder than their speculation.


Have you done a writeup of how you made the switch to entirely non-packaged food? I would imagine it would involve a lot more grocery trips?


I've written several blog posts on it and my two TEDx talks cover more.

TEDx talks: http://joshuaspodek.com/my-second-tedx-talk-what-everyone-ge...

Blog post listing of other posts: http://joshuaspodek.com/avoiding-food-packaging-2

Note: not entirely non-packaged. I fill about a load of garbage per year. Yesterday I threw out my load for 2019: http://joshuaspodek.com/emptying-my-household-garbage-for-th...

Here are reviews of my famous no-packaging vegetable stews: http://joshuaspodek.com/food-world-reviews


We've mostly eliminated packaged food for years now; it is usually healthier and tastier to make it yourself. Some things make it past that filter for convenience.

On the flying one, it is hard for me to understand. I literally never flew anywhere until I was nearly 30, where I had a work trip. Outside of work, I've flown once.


Seems about right. I did a self-test, leaving my phone at home for a whole work week. No other restrictions; this worked great. It had the benefit that I could still use the phone whenever I needed it, so I wasn't missing interactions, but I had much more concentration while doing work-related tasks, and it forced a reasonable amount of self-reflection time while in transit.

On a permanent basis, I would request a work dumb-phone from my employer that would stay on my office desk.

Other than that, a lot of the anxiety related to the phone went away once the phone was on permanent silent, all unknown numbers were blocked silently, and notifications were limited. The phone is for people I want to interact with, when I can.

FOMO


The story has holes in it. They agreed to part with their phones for 9 days, yet after only two weeks they began to think phones were limiting their relationships.


The teacher has done this for a couple of years, and I think the timelines have become mixed.


If most of the students did bad in the exam, the problem lies with the one factor in common: the professor.


As a college professor I agree completely. I forget who I heard it from, but I remember someone saying that if your class is failing, the problem isn't the students, it is your pedagogy.

Certainly there are always individual students who fail but if most of them fail the midterm and it is only afterwards that you learn they don't understand the book then you aren't doing a very good job of tracking their progress and you aren't actually teaching them.

I don't know how large his classes are. It can certainly be hard to monitor what students are doing in a large lecture hall but in a smaller class of 20 students or so I haven't had much difficulty with students who wouldn't put their phones away when I asked them to.


In this case, maybe (I haven't read the article). But generally, I'd be wary of saying this. It reduces a complex subject to a single, easy answer which, incidentally, absolves students of any responsibility for their own learning.


And it absolves the professor of responsibility as well.

Not to say the author is entirely full of it, but broad generalizations blaming the existence of cellphones for the problems of our entire culture is way too reductionist.


> If most of the students did bad in the exam, the problem lies with the one factor in common: the professor.

Class distributions vary significantly. I TA'd elementary computer programming for seven quarters.[1] Some classes were more engaged, some less. My worst was a TR 3pm: a quarter of the students failed. My best was a summer course in which I taught without a computer (except for the labs): the room wasn't equipped, so I printed the slides and wrote on transparencies.

My N of 7 isn't significant (average grades ran from 72-82%, I think, and I don't know the standard deviation), but it's worth thinking about the problem more globally than your comment suggests. We often think of student grades statistically (work that curve!), but we rarely think about it with respect to a set of classes. It's likely that this professor has had classes that were very good and very bad and some that were in between.

Reflecting honestly on my own career on the other side of the desk, I was not a very good student. I studied to pass tests, not to learn. I got a degree because failing me would have looked bad for the institution, not because I was particularly deserving.

The humiliation of that experience turned me (I think and hope) into a life-long learner--and it informed my teaching in the classroom. Failure is one of the best teachers to those willing to learn her lessons.

[1] We team-taught occasionally and always team-graded the exams (one question or section, depending, per TA) to help level the grading. We had a department policy that failing the final exam meant failing the course. We encouraged collaboration among the students on homework--less so on labs--reasoning that it would strike a balance between understanding the material and acquiring needed soft skills. Office hours policies were quite liberal: all students were welcome at all TA hours, so if you had an issue with your instructor you could visit the TA and supervisor. Complaints were taken seriously, and underperforming teachers were removed. My reviews were generally good, with the biggest complaints about my language. (I modified my class structure to spend a few minutes on vocabulary each day in response.)

{{ Edit: typo. }}


As a TA I led computer labs for a large intro class for several semesters. One semester I got stuck with the 8am section, which I wasn't terribly excited about (especially since it was winter), but it turned out to be great--the section was smaller than the others because most students also didn't want to be there at 8am, and the students who did sign up for the 8am section were more motivated and self-directing than average.

Having to be on campus ready to teach at 8am still wasn't great.


I've given the same tests to my students 3 times - word for word - and many students still failed. The good students who cared raised their scores with each repeat. The nonchalant students sometimes managed to score lower.

I don't keep my intention to repeat test questions secret. I do this to prepare them for pre-uni, university and professional exams which always reuse questions.


I'm curious: in what way do you think this is preparing students, if they - in your judgement - do not appear to be learning from it?


Is the statement below true?

>pre-uni, university and professional exams ALWAYS reuse questions.

Is solving past questions a useful SKILL / hack for students to learn?

If you don't agree with these, then nothing else I say would matter.

If other teachers repeated test questions, it would take less time for all students to adapt and benefit. I only taught at the first school for a term; the owner wanted me to stay but I had a better-paying offer. At the second school, two terms. By the second term, only 3 students consistently scored below 50% on tests. And of these three, one was extremely nonchalant. The remaining two were just slow.

Make no mistake about it - all teachers repeat questions. It's just that it takes too long for a test question to show up in an exam - so most would see the exam questions as fresh ones.


So... you include a discussion of getting "better offers" to indicate that your services are in high demand, as a signal that your teaching style must be good. You can leave that part out next time if you want to be taken seriously, because teaching for 3 terms at 2 different schools is hardly something to brag about.

But I'm afraid it doesn't really address the problem at hand. What if your first exam was bad, and by repeating it, you were just repeating the same bad questions, teaching a useless skill of memorizing questions instead of understanding the material?


>because teaching for 3 terms at 2 different schools is hardly something to brag about

Context

>What if your first exam was bad

Good question but irrelevant. You ignored my questions. I want to believe that I'm not the only one who sees that entrance exams to universities, university exams and professional exams repeat over 80% of their questions?

If all your exams were freshly set questions, say so. If not, what's wrong with preparing my students for reality?


Students should be aware that teachers can be lazy (like everyone else), and may repeat exam questions. It's an important part of exam strategy to teach that - by telling students. But this is a hack, as you say: it's not learning the material. If you don't have the understanding, then this hack is not valuable and may be very dangerous to rely on.

You are not preparing the students for an appropriate diversity of exam questions if you ask the same questions each time.

The students are perhaps doing worse because they have understood, probably correctly, that your testing is not a worthwhile exercise for them.

If you don't have novel questions, you are only demonstrating that it is unimportant to learn new material, or unimportant to find a deeper understanding of the existing material.


I wonder if you would have the same opinion if you taught in a poor neighborhood, with kids who have English as a second or third language.

I would gladly admit that my method is shitty if:

They only failed my subjects

Some still couldn't answer copy&paste kinda questions - open book quizzes

Other teachers didn't pad their scores at the end of the term

I'm done with this thread.


> the problem lies with the one factor in common

Why not having a family?

Or living in the same country?

Or having the same educational system?

Correlation is not causation.


- Because the original premise hasn't controlled for family, we can only work with the data we have

- Because other students in the same country succeed

- Because other students in the same educational system succeed

That argument only works if you manage to isolate the conditions to almost perfectly match a group, mine was "this teacher, teaching this subject" and the data fits.

Regardless, crying "my students use phones and all of them are failing" when there are students with other teachers that use phones as well and don't fail makes blaming phones an illogical conclusion.


> - Because the original premise hasn't controlled for family, we can only work with the data we have

I don't think the data says what you imply, though.

> Because other students in the same country succeed

[citation needed]

The teacher in the article poses a hypothesis:

- he suspects the phones might be partially involved in the problem

So he puts the hypothesis to the test:

- he asks his students what went wrong

- they answered, according to the article quotes “We don’t understand what the books say, sir. We don’t understand the words.”

- he asks them not to use their phones for 9 days, offering extra credit as a reward

After the test he collected the results. Among them, I'll quote the first one:

> “Writing a paper and not having a phone boosted productivity at least twice as much,” Elliott claimed. “You are concentrated on one task and not worrying about anything else. Studying for a test was much easier as well because I was not distracted by the phone at all.”

It looks to me like they actually had a problem, and it was not their professor.

> Because other students in the same educational system succeed

Are they?

Across the Board, Scores Drop in Math and Reading for U.S. Students

https://www.usnews.com/news/education-news/articles/2019-10-...

> Regardless, crying "my students use phones and all of them are failing"

That's not what he did. What you are doing is manipulative; it's a well-known fallacy called a "false premise".

https://en.wikipedia.org/wiki/False_premise


Did I miss the part where he says what their grades were after this experiment? Without any specific improvements it is just someone complaining about kids these days.


It’s also the “kids these days” agreeing with the professor, which is definitely different from other get off my lawn type posts


I'm not quite sold on that either. The professor obviously cherry picks the quotes that would show him in the best light.

And students know what answer the professor wants to hear, and he is giving them bonus points on their grade for participating in the experiment. Not to mention that a student who participates in the bonus-point offer is likely a student who is doing badly in the class and would appreciate the assistance from the points and from removing the distractions, as they likely are the same students that think the professor knows what he's talking about, otherwise why would they volunteer for the experiment at all?

In that situation, it would not be difficult to cherry pick some supporting opinions. Remember, this is not exactly a controlled experiment.


Professors generally have the academic freedom to try such stunts to get into the media. But it still seems like a pointless experiment in a philosophy class. Anyone know why colleges are not more focused on getting kids ready for the working environment?


It's a simple pattern.

1. Offer extra credit for some wacky experiments with students

2. Make it clear to the experimental subjects what results you want to hear in advance

3. Complete the experiment without any rigour (blinding) and hear the results you wanted to hear

It's not at all surprising to me that a group of students sucking up to a professor for extra credit would say just what they thought the professor wanted to hear, whether they believed it or not.


Forget about giving up smartphones; would it even be possible to teach ourselves some notification etiquette[1], e.g. setting a priority level when sending a chat text, i.e. cat gif (delayed) vs. something important (immediate), etc.?

I think the app behemoths which gained from the notification pollution would never implement anything that forces etiquette on the user, and the well-being measures launched in Android and iOS are useless when it comes to limiting smartphone usage.

[1]https://needgap.com/problems/59-notification-pollution-mobil...
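
The mechanics being asked for are simple enough. Here's a rough Python sketch of what sender-set priority could look like on the receiving end -- entirely hypothetical (the Message fields, priority names, and one-hour batching window are my own illustration, not any existing messaging API):

  import time
  from dataclasses import dataclass, field

  @dataclass
  class Message:
      text: str
      priority: str = "low"                 # sender-chosen: "low" (cat gif) or "high" (important)
      sent_at: float = field(default_factory=time.time)

  def should_notify_now(msg: Message, batch_window_s: float = 3600) -> bool:
      """High-priority messages buzz immediately; low-priority ones wait for the next batch."""
      if msg.priority == "high":
          return True
      return time.time() - msg.sent_at >= batch_window_s

  inbox = [
      Message("look at this cat gif"),                   # defaults to "low": delayed
      Message("meeting moved to 2pm", priority="high"),  # surfaces immediately
  ]
  urgent = [m for m in inbox if should_notify_now(m)]
  print([m.text for m in urgent])  # only the meeting change, until the batch window passes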


Remember the old days when a bunch of friends would agree to meet at any place and all of us would find each other after arriving at that place. Wonder if that would be possible today?


I think you conveniently forget the days where people were 30 mins late and you had to wait for them in the cold not knowing where they were and when they would arrive.


Cultural differences apply. I remember realising when socialising with Italian students, that arranging a time to meet was a precursor to 60-90 minutes of chat about what we might do, who was going to turn up etc, etc.

I warned a friend who I took on a subsequent visit, that meeting at 7.30pm probably meant eating at 10pm. And in fact I was 15 mins optimistic! This included a visit to someone's house, being invited in for a quick aperitivo, driving 40 mins to restaurant etc...


In my recollection, this hardly ever happened. And a friend would not do this.


That didn’t happen often to me, and when it did we just moved on.

Today, we’re all stuck in these passive aggressive text standoffs in prisoner dilemma situations about where to go or what to order.

Smartphones were superpowers when only a few folks had them. Now, they are not that!


People would be a lot more likely to turn up on time.


Remember Blaise Pascal's divertissement (meaning: diversion, systematic distraction, etc)? He was alive in the 1600s, and he was making the same exact case we are making today! So I wouldn't be too quick to imply the ol' days were the better days.

I argue the better days are when we become piercingly aware that we have been alienated from our true selves and consequently err in the opposite direction of worldliness and more in the direction of peace, silence, and solitude as a means to live the good life. As naive as all the above may sound.


Pascal included work in the divertissement

He wasn't talking about having fun; he was talking about self-distraction to avoid thinking about life.

He thought that distractions, like working, served the purpose of not thinking about the misery of life, such as the ultimate fear of death.

Pascal never said that the ol' times were better; he said that women and men, left alone with their thoughts, cannot handle it.

According to Pascal, people only live in the future, imagining a life that does not exist yet, while they avoid thinking about their past and present actions as causes of their actual condition.

It's more or less what happens today with the disposable nature of social networks content.

He wasn't making the same exact case; Pascal is one of the best philosophers in history.


That's interesting and original but rarely said here.

Only other mention of Pascal divertissement I could find: https://news.ycombinator.com/item?id=9450559


We had our own tools that previous generations didn't have. We would have had difficulty making arrangements without our tools, but we would have found a way. Kids today are no different. They have their tools and they use them. I'm totally ok with that.


It sounds like you think technology has fully eroded everyone's ability to plan. I doubt that's the case. We just use different tools now, and expend less time/energy on logistics.


I own a street atlas, but I wonder how many others still do..


Plot twist: most people I knew didn't own one "back in my day" either.


Probably more than you think. As smartphone maps have evolved, their design has changed and really sucks for many purposes.


Am I to assume that you haven't left your house since "the old days"? Why in the world would that be impossible?

Even more obviously: how exactly do you think people find each other when they arrive at a location? Are you assuming they don't use their eyes to look around?


As I read this excellent article my dog sitting beside me looked up at me several times to say "hey human, quit staring at that screen!"


Yes, cell phones and gadgets are making people isolated instead of making them more social. Keeping gadgets away from us will make us more creative and perform better. I would like to try the experiment myself as well.


Wow. Really, really bold leadership here. He actually asked students to go without their phones ... the students' families were happy the teacher didn't ask for theirs too. Such power!

Meanwhile, a university professor,

- isn't mom/dad, pseudo-punishing students by removing their phones

- isn't moonlighting as a pseudo-psych researcher who guessed phones were the problem

- isn't a cool guru by wowing kids into wisdom with this unexpected ploy

If your students did bad on their exam

- make improvements as a teacher if you feel some responsibility lies with you (there's no need to gas on about this to students)

- leave the grades unchanged

- stop confusing the issue with stupid exercises and then self-marketing them to the media


I recommend rereading the article once more. He asked students to volunteer their phones to him; he didn't forcibly remove them. He left the grades unchanged and just offered extra credit to the students who were determined enough to do the experiment.


s/determined/desperate/



