I loved AI: A Modern Approach—still the best all-around textbook on AI imho. I'd only add Susanna Epp's Discrete Mathematics with Applications, and I'd just focus on methodically and thoroughly working through those two, personally. First, Discrete Mathematics, then AI: A Modern Approach. This is exactly what I did, and it was a great experience, super helpful.
I went straight to work and didn't get to college until 41/42. I'm working on my undergrad and plan to continue to my masters. I've found it incredibly rewarding for two main reasons.
First, since I have a career already, I'm free of the pressure to go to school for career purposes and can focus on something I enjoy, which also provides immense value to what I'm doing every day (I chose philosophy, much more relevant and practical than I think many realize).
Second, I enjoy the experience and get quite a bit more depth from it than I would have in my 20s. It's a richer, more meaningful experience now that I'm older, have a strong sense of who I am, and am not put off in the slightest by naysayers or influenced by people's opinions of what I should or shouldn't be doing. I have more maturity now than at any other time in my life, and this has served me well in approaching topics with intellectual humility and simply enjoying the process of going from knowing nothing to knowing a little. I do all the reading and then some, reading far and wide as well as doing deep analysis, writing all my notes, reviewing, and doing practice essays, and I enjoy every bit of it rather than seeing it as a chore.
So, some initial thoughts for you, hope they're helpful. The only advice I can give is to enjoy it, realize it's a wonderful opportunity, be structured and disciplined with your time, and use your hard-earned experience to your advantage.
I really second this. The best thing about study as an adult is that you're largely studying for pleasure, even if there is a work goal in there somewhere.
You know what's relevant and interesting to you and what's not.
Philosophy is essential; how it's taught is unfortunate. In my philosophy class it was almost all about the history of philosophy, not the reasoning behind it. Assignments and tests were all about time periods, the specific names of philosophical ideas, whom they came from, etc.
I'd rather have open-ended assignments. Ones that give moral dilemmas, and challenge their solutions. Make me think about something from a perspective I haven't thought of before. That's a powerful tool.
But that's how academia works; the culture wants tests and assignments with checkboxes.
I can't help but come to the defense of the traditional style, particularly for introductory philosophical classes. The fact is, people have thought about every moral situation from every angle already and just asking undergraduates to wax on about the trolley problem is kind of a waste of time. It is much more valuable to get them into the detailed history of ideas so they can appreciate just how long these problems have been open and discussed.
I'm not sure where you went, but my undergrad philosophy courses matched your desired approach: we were presented with problems, and presented solutions in return. Sure, we had to know the historical context of what solutions others had brought already... but our work was not regurgitation of those ideas; it was reconstruction into new ways to advance the discussions.
I have never taken a philosophy class that involved tests or naming philosophical ideas. The work of an academic philosopher is to write papers (like an academic historian or sociologist), so a philosophy degree should focus on writing papers. Often the papers will be analyzing previous philosophical work and attempting to present some novel synthesis of it, either with itself or with some broader context. Neither "giving moral dilemmas" nor "quizzes about history" fit anywhere into that picture.
I’m a philosophy grad who learned to code a couple years out of school.
Philosophy gives you a set of metacognitive skills that help everywhere. It teaches you how to think. It shows you which classes of problems are soluble, and which are things where we just have to accept tradeoffs. And it's really focused, in a funny way, on economy: does your argument actually do something? Does this theory offer clarity and bring us closer to truth? If not, well, why are you wasting your breath on it? Philosophy teaches you to see that some avenues are fruitless or just not worth the effort.
Also, non-practically, it shows you the full depth of wonder in the world. Wherever there is capacity for thinking to be done, philosophy says, you can elucidate something important to our human condition.
I hear a lot of non-engineers say this. Talking about formal logic, and how philosophy and math were once the same discipline, how math proofs are akin to philosophical arguments etc.
I don't think this crowd would get much out of it.
As an engineer, I got into logic through the philosophy department. It was very eye-opening for me.
Engineers are not the models of logical thinking they assume they are. Illogic is everywhere, and it takes constant vigilance to avoid always going with your gut feelings.
At the very least, I think every engineer should take a "philosophy of science" class. We tend to focus quite intensely on how to do things. Borrowing a little bit of "where do proofs and the scientific method sit in the grand pantheon of human knowledge" from philosophy is grounding. Anyway, it is probably a gen-ed that is at least somewhat useful.
It depends. Some people have really weak philosophical foundations and really need to hear about it if there is something out there that grounds them a bit better.
We can't say whether any particular approach to life is the best, but we can say that if you change your mind about which approach is best at age 70, you've spent a lot of years setting up for the wrong outcome. It is never too late in theory. But as a practical matter, 70 is a bit late to sit down, take a step back, ask why, and try to act on it. Better for people to line themselves up with good foundations in their 20s or maybe 30s. It is good to explore the options early, and to think a bit about what the word 'option' even means philosophically.
I agree. Philosophy gives you a level of abstract reasoning of the form: "if we agree (with Kant) that we should only take those actions which could be universal law, does it follow that the death penalty is morally justifiable?" There is some degree of reasoning from premises here, but all of the objects you deal with are things you come to with a bunch of intuition that you never really leave behind.
On the other hand, something like:
> Given a one-dimensional invariant subspace, prove that any nonzero vector in that space is an eigenvector and all such eigenvectors have the same eigenvalue.

really forces you to grapple with an entirely different level of abstraction.
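For anyone curious, here's a minimal proof sketch of my own (assuming $T$ is a linear operator on a vector space $V$ and $W \subseteq V$ is the one-dimensional $T$-invariant subspace):

```latex
% Sketch under the stated assumptions: T : V -> V linear,
% W a one-dimensional T-invariant subspace of V.
Let $w \in W$ be nonzero; since $\dim W = 1$, $\{w\}$ spans $W$.
Invariance gives $T(w) \in W$, so $T(w) = \lambda w$ for some scalar
$\lambda$; hence $w$ is an eigenvector. Any nonzero $v \in W$ equals
$c\,w$ for some $c \neq 0$, so
\[
  T(v) = c\,T(w) = c\,\lambda\,w = \lambda v,
\]
and every nonzero vector of $W$ is an eigenvector with the same
eigenvalue $\lambda$.
```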
Kant is actually towards the top of my list of "stuff I thought was dumb before I read the actual source material but which I now have a lot of respect for." The categorical imperative stuff is a reflection of a really profound value that Kant assigns to human life.
Utilitarianism benefits a lot from having a Cliff notes version that sounds less dumb than the Cliff notes versions of other ethical frameworks, but I don't think that is the right way to evaluate ethics. Besides, philosophy class ethics is really more of an exercise in "let's construct a formal framework that matches our intuitions" rather than "let's make normative judgements about stuff in the real world."
> As others have already asked, could you expand on this? Very interested.
I started undergrad in my 30s, and also majored in philosophy for similar reasons. It really is the most rigorous non-STEM undergrad degree you can get. And for people (like myself) who can’t pass calculus, but are still fairly intelligent, it can easily be parlayed into a more technical graduate program.
Everybody can pass calculus. The only way one can fail at math is through gaps in knowledge. If you managed to get through something like Kant or Hegel, you can get through any math subject, provided you have the necessary prerequisites.
I was a double major in philosophy and CS in my undergrad. Philosophy was fun, but in hindsight I wish I did math or stats or some other STEM instead. I would say my main takeaway from the philosophy degree was developing a sense of intellectual respect for big, important ideas that I don't personally agree with (various religious thinkers, Marx, Aristotle etc), but it really doesn't compare to the actual nuts-and-bolts abstract reasoning skills you pick up in an abstract algebra course, for example.
I also found that I could consistently get As in humanities courses with ~20-40 hours of work per quarter (the time to write 1-3 papers) once I picked up the skill of "writing like an academic", vs my CS courses which continued to be challenging and require a ton of effort to succeed in up until my graduation. My senior year, for example, I had some core-requirement course about theater -- I attended zero classes and did zero readings until I sat down to write the paper, and I got As with compliments from the professor on how well-written my papers were. YMMV.
I doubt there is significant transfer between philosophy education and other tasks (like programming). Curiously, the people who should doubt this conclusion (the educated philosophers) are the ones who jump to accept it. Anyway, the literature on this matter is wide enough that our prior should be that there is no transfer, and evidence to the contrary must be produced.
My daughter is a philosophy major while most of my family has been STEM for generations (father's an engineer, mother taught college math, grandfather was an engineer). I'm reassured by the requirements for formal logic, and the obvious applications in law, but also at the intersection of law, ethics, and many of the ML systems that I foresee coming online.
She actually brought up this Harvard philosophy professor who had a story about keeping track of parentheses. I took advantage of the opportunity to show her the connections to Curry and from there to Lisp and The Little Schemer. She got it. She can reason, formally. That's important.
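To make the paren-tracking idea concrete, here's a toy sketch of my own (not the professor's actual exercise); with a single bracket type, a depth counter stands in for a full stack:

```python
# My own toy illustration: checking that parentheses balance.
def balanced(expr: str) -> bool:
    depth = 0
    for ch in expr:
        if ch == "(":
            depth += 1           # an open paren awaits a match
        elif ch == ")":
            depth -= 1           # close the most recent open paren
            if depth < 0:        # a ")" with nothing to match
                return False
    return depth == 0            # every "(" found its ")"

assert balanced("(lambda (x) (f (g x)))")
assert not balanced("(f (g x)")
```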
Just to offer a counter-point to the others here, I've personally noticed that quite a few people who study philosophy (either formally or via self-study) tend to become "disembodied". Formal reason becomes king, even when informal methods are more appropriate for solving the task at hand, and the intangible becomes irrelevant, even when it matters deeply.
Perhaps things would be different for someone in their 40s, who has a wealth of real-world experience to draw on; for them, philosophy would be valuable.
But for the average 18-year-old kid, studying it seems to create a set of terrible habits that take years to undo before the student can become a properly integrated adult.
> Perhaps things would be different for someone in their 40s, who has a wealth of real-world experience to draw on; for them, philosophy would be valuable. But for the average 18-year-old kid, studying it seems to create a set of terrible habits that take years to undo before the student can become a properly integrated adult.
In the time of Plato and Aristotle it was frowned upon to teach philosophy to students below the age of 35 because they wouldn't know what to do with that knowledge.
I started as a philosophy major and switched to history and economics for this exact reason. My philosophy classes became incredibly disconnected from reality and ended up being endless arguments about frameworks and formalisms, but without the rigor of mathematics outside of formal logic. I still loved my philosophy classes, but I'd recommend anyone studying it as their first undergraduate degree pair it with something more concrete.
I guess philosophy education tends to vary from region to region, but "young people into phil" tend to be insufferable in one way or another, while older people with an understanding of philosophy (...and a great many real world problems) tend to be pretty okay people.
But that seems to be a broader issue with specialization anyway. Focus on one lane for too long and your brain starts to malfunction in odd ways.
Upper-level proof-based math courses are much better in my experience (speaking as someone with a BA in philosophy who has been taking math courses part time for the past few years).
Re: "Now there is still a lot of golden days, but it will never be like you were 20, for sure.", this is not necessarily a bad thing. 20-somethings move so fast because they don't have the experience to move strategically, to stop, think slowly and methodically, gather perspectives and advice, and make deliberate, powerful moves. Older people don't need to be so hyperactive; we move with more efficiency because we don't need to move as MUCH. Learn to play chess rather than checkers. This worship of youth is disease.
There is no zenith. Own your life. Get out of your head.
Everything you've done up to this point has only been a prelude to whatever you choose to do next. If you choose to see those years as your zenith and stop doing anything, that's on you, it's not because you've supposedly reached some mythical temporal horizon. You've accrued experiences, skills, wisdom, perspective, understanding, relationships, and resources. If you can't see a way to turn those into meaningful next pursuits, then (again) that's on you. Don't make excuses for yourself.
Our culture unfortunately worships youth. We seem to think that life after 40 is downhill, but this just isn't true. From 40 to 70 or even 80 is a PRIME period of life. You have the things you need to do world-changing work, the confidence that comes from hard-won experience, and the tangible life experience to appreciate the meaningfulness and beauty of every single moment.
So-called "zeniths" are whatever you decide they are. If you reached your absolute best in one skillset, pick up a new one. Reinvent yourself and kick off a whole new career. Serve the people around you. Try to make the world better somehow for your having been here. There's no end of things you could do. Don't waste any more time with this angsty self-pity. Life is too fleeting and too wonderful to be so self-absorbed.
I'm not sure this isn't just an illusion. Young people are maybe a little more visible in the meaningless parts of our society, but almost all truly powerful and rich people are old. Don't fall into the trap of reading the Forbes list and thinking that it bears any resemblance to the actual power structure; most of the people on it are there only because they are in the business of self-promotion. The real ones are those that are literally paying to stay out of the spotlight.
Agree. The way I see it, not being in my 20s is an edge, not a liability. I don't have to suffer all the things young people are about to because I've already been through that part of my life. There's a lot you don't have to worry about any more once you get to the other side of it. Plus, I've had plenty of years to make mistakes and have a better sense of who I am (and who I'm not) and what I want (and what I emphatically don't) than ever. In my 20s everything was about screwing and appearances and destroying the old; now it's all about Love, Beauty, and Creation.
There's just no question to me—life really doesn't even begin to be amazing until all the noise quiets down sometime in the 30s–40s. That stage is important, and we NEED to burn hot through that period, but it's by absolutely no means whatsoever some kind of "peak"—that idea is so much bullshit.
If there is such a thing as a "peak", I feel like it's probably more accurately something like the 50s, with a long tail through the 60s and beyond. Those decades make up (imo) the window where you can bring all the things you've lived to bear and live the culmination of all your insights—powerful stuff.
Discrete Mathematics with Applications by Susanna Epp is the one book that I feel took me from simply knowing how to write programs to being a software engineer by filling in the theoretical blanks I had from being self-taught. Discrete mathematics in general is so wonderful and applicable to day-to-day software engineering problems and to me, that book is the best, most coherent, and most thorough one available on the topic.
My top 3, in order of how I try to apply them (i.e., if 1 doesn't help, move on to 2, etc.). I learned these all from reading various philosophy works, by the way, so perhaps cognitive hack #1 should be "read books".
1) Suspension of judgement (from Sextus Empiricus, Zhuang Zi, Ecclesiastes): avoid forming an opinion at all about things that are not evident. The way I do this is by thinking through an opposing argument or two, and using language like "it seems" or "it appears" rather than "I know", "I think", etc. This technique saves time and energy by helping me avoid getting wrapped up in opinion-based thinking and helps me develop equanimity.
2) Suspension of value-judgements (from Epictetus, Marcus Aurelius, Seneca, Zhuang Zi, Ecclesiastes): being aware and in control of the value-judgement loop (this thing is good or bad). I do this by shifting the language in my mind from "that is bad" to "I feel this way because..." Again, like #1, this is about inverting the locus of control in my cognitive discourse such that my mind can easily go its own way from there, only on a more productive path.
3) Awareness of the mode of thinking I'm in, and the kind of learning that's appropriate to the task or objective at hand (from Plato). There are several modes of thinking or learning (eikasia, pistis, dianoia, episteme, techne, phronesis, and noesis, for example). Simply being aware of which mode you should be in for a task is much more valuable than it might appear at first glance. I see these less as bins to put various kinds of thought in and more as tools to apply to a problem.
Reviewing this, a common thread is self-awareness developed to a point of disciplined introspection and intentional change by adopting these kinds of cognitive tricks. Also, reading is good for you. :)
0) Keeping one's mouth shut. Trying not to have an opinion until a) I have enough information, and b) it is important enough to voice. I am trying to spend more time perceiving instead of broadcasting.
I do this too, and it works great in personal contexts because, generally speaking, people love sharing their opinions and hearing themselves talk.
This is in stark contrast to the workplace, where I've found that while I keep my mouth shut in a meeting to gather my thoughts before contributing a well-informed opinion, the loudest person in the room has already spoken a handful of times and left their mark, and then continues to speak over people and dominate the conversation for better or worse (usually the latter, unless they're a SME).
I'm then forced to revert to speaking ASAP to get a word in so I don't walk away from a meeting being perceived as having contributed little because I wasn't getting enough words in.
Maybe it's all in my head because I'm an introvert and meetings drain my energy. Anyone else experience this? Got any tips?
Amen to that! I often end up in discussions and decision-making at work where I haven't even scratched the surface with my knowledge of the problem.
Trying to listen and absorb as much information as possible from the people who know the most about the problem is my way of getting to where I can quickly make a decision (if that's needed).
Agreed. And the harder aspect of this is to actively listen to all parties while at the same time holding my questions in mind and allowing time in the discussion for them to get answered. Notes really help here.
Watch women and men in a meeting: men will talk over people and interject wisdom, either comments or questions, while women will sit back, let the questions get answered, and then, when enough time has gone by, ask the unanswered ones.
One of the hardest things for a group to do is leave enough dead air so that others can speak, especially over remote connections.
Listening is an important skill. To avoid being perceived as passive or absent, "active listening" [1] can be helpful. Just remember that you are communicating via a shared medium (half-duplex) so sending should be kept to the necessary minimum ;-)
A corollary: if something is true, but not relevant, then why should anybody care if it is true? Know when you're putting your efforts into things that can't pay off.
These are forms of adjusting for bias, and one should be cautious whenever anyone (oneself included) thinks they're good at accounting for them. That said, there are times we do feel justified in our decision because we feel we have adequately looked at "all sides."
At times the process of taking a neutral approach and/or vocalizing that neutrality demonstrates weakness. "Plan X makes sense because of A and B, but it does have this trade off D. That said, Plan Y could make sense if we really value D." In meetings, another person often lays out a single POV strongly and wins. Ideally, this opinion is strong because it's well thought out (and bias-adjusted), but sometimes it's strong simply because it's stated as such. "Plan Y is right. Because D, which I didn't even consider until now. <No mention of A or B.>"
Here, one would hope the group or group leader checks and balances this type of behavior so that a single biased person doesn't carry a generally clear-minded group, but of course that doesn't always happen.
Considering one's biases is a very good thing, but I wonder if the pragmatics of group dynamics render it something good for the soul, but poor for action.
I think the post was referring to individual projects, not group dynamics, but your point is certainly true of group situations. Sometimes aspects of the group harm the group's ability to reason or behave rationally.
I'd argue that even in a one person project, information becomes available gradually and one must learn to re-consider old conclusions again in light of new information, even to re-consider old questions that might have been easily dismissed earlier.
Human rationality is biased by our use of heuristics that take a few milliseconds of brain time and work pretty well, but that fail when we trust them too much even when we have the luxury of more time to consider the information. Even things like a member of the group asserting something confidently can throw off the rational faculties a bit.
Agreed. I use qualifiers far too often. It's good to be open to being wrong, good to not convey over-confidence, but when you are the one operating outside the social norm it dilutes whatever point you were making.
I am trying to get in the habit of providing an overarching qualifier so that I'm not deceiving my audience and then making the rest of my statements without any qualifiers that would be covered by the overarching one. It is definitely a hard habit to change, however.
These techniques are more general psychological tools than specifically about controlling bias. They're also meant to be internally-facing, not necessarily for sharing outside yourself. In the context of the original question about "brain hacks" (not necessarily group dynamics) the underlying thread is finding ways to change your perspective in order to gain insight, which is a very useful "brain hack" indeed!
You're right, but for many people it's difficult to become a thought leader until they have first improved their skills in reserving judgment and combating bias. I've met people who seem to have a strong opinion on everything, and they convince some people, but their opinion meets an unfortunate end when they present to a more discerning group.
I agree 100%. In theory it’s good to consider all perspectives and not acquire hyperfocus/tunnel vision on any one plan or destination. But in practice, people seem to find leaders with tunnel vision far more convincing and worth following than those who take a more multifaceted, less focused approach to leadership, IMO.
Maybe that’s because the practical reality of discussing one's attempts to be bias-free always comes off as disingenuous, no matter how genuine those attempts may be. So fuck it, man. Embrace bias :)
I liked this too, and given the sister comment asked for a more colloquial description, I came up with my own:
1) Instead of using firm statements attached to yourself like "I know" or "I think" use statements that are easier for you to contradict and discard without feeling like you are attacking or discarding yourself, like "it seems that..." or "it appears that...". Then you can say "it seems that X but in contrast it appears that Y". That is better than "I know X but I could be wrong because Y".
2) Don't immediately describe things as good or bad, or the right way and the wrong way; that will prevent you from seeing an alternative solution, because you will have automatically labelled it as wrong when you labelled something else as right. Shift from "that is bad" or "that is the right way" to "I feel this way because...". It is easier to change a decision based on knowing how you feel and have felt than when you have "money in the game" from having said (even to yourself) that something was the only right way.
[Note: In general, being able to step back and pick the right thing because it is right is useful, but only after you have honestly considered the situation in a fair and impartial way. Being too partial too quickly cuts off your ability to think and accept better solutions. If you already knew it, you could just choose, but the point is you are trying to think through it, which means you don't know but are evaluating how you feel and how things seem to be, but trying to uncover more.]
3) Understanding your current state of mind, and what state of mind would work better for the current task, is important, so you can match those up, or at least understand more deeply. These are things like the appearance of things or imaginative situations (eikasia), good faith and trust or persuasiveness (pistis), discursive thought (dianoia), theoretical knowledge (episteme), craft or skill (techne), practical wisdom (phronesis), or intuition (noesis). [Note: my glosses of the Greek terms are rough, but...] You can combine several of these to reach your goal, and you will often have a sequence of thoughts that fall under different ones.
Introspection is _huge_. If you can understand and evaluate how you think and feel, you can start to move past that, or use it to your advantage, or see the flaw in your own reasoning, or be more observant, or form new habits, etc.
Personally, I like yoga for this reason; it helps to accept your feelings as they come and to observe them without immediate judgement. I think it helps in developing intuition, controlling your thoughts and emotions, etc.
So, extending your thought about common thread, one needs to develop self-awareness. Hacks won't help much with this. But there is a systematic way of cultivating it, called meditation.
Another useful one, coming right out of the Stoic tradition, is the trisection into things that one
1) has no control over, such as weather, other people's actions, etc.: do not fret about those.
2) has complete control over, such as one's thoughts, judgements, response to events, actions, etc.: concentrate on these.
3) has partial control over, such as one's health, reputation, etc.: do not fret about the outcome, but prepare/do your part as best as you can.
This is well explained in William Irvine's A Guide to the Good Life [0], but was already proposed by Epictetus in the Enchiridion [1], 2nd century CE:
> Work, therefore to be able to say to every harsh appearance, “You are but an appearance, and not absolutely the thing you appear to be.” And then examine it by those rules which you have, and first, and chiefly, by this: whether it concerns the things which are in our own control, or those which are not; and, if it concerns anything not in our control, be prepared to say that it is nothing to you.
At the risk of substituting books with URLs... do you have any links to any of these concepts? For example, I just tried googling "eikasia, pistis, dianoia, episteme, techne, phronesis, noesis" and I didn't, eh, come up with anything precise enough to read.
EDIT: Someone down in the thread did the kindness already!
It's substituting URLs for books — either that or replacing books with URLs. [0] This might seem pedantic, and I know that language evolves, but:
1. We risk confusing, and therefore we disserve, non-native speakers if we use the language in ways opposite the accepted standard meanings. (One could think of language as an API, and no matter what the API, precision and accuracy often matter.)
2. Like it or not, the brute fact is that people who use language in non-standard ways are often judged harshly — and silently — for it. One can rail against that, or one can shrug one's shoulders and simply conform to standard usages, at least in public- and quasi-public forums. (In other words: Pick your battles.)
“3) (transitive) In the phrase "substitute X with/by Y", to use Y in place of X; to replace X with Y as in
‘I had to substitute old parts with the new ones.’ (This usage was formerly proscribed.)“
“Traditionally, the verb substitute is followed by for and means ‘put someone or something in place of another’, as in she substituted the fake vase for the real one. From the late 17th century substitute has also been used to mean ‘replace someone or something with something else’, as in she substituted the real vase with the fake one. This can be confusing, since the two sentences shown above mean the same thing, yet the object of the verb and the object of the preposition have swapped positions. Despite the potential confusion, the second, newer use is well established, especially in some scientific contexts and in sport (the top scorer was substituted with almost half an hour still to play), and is now generally regarded as part of normal standard English.”
Who are we to argue with the Oxford English Dictionary? You can take it up with them. lol
Have a nice day :)
(I will say that the “sport” example is bad: the player is not being substituted with the time!)
[Grumbling:] So now even the OED is in on the global conspiracy to degrade the language, eh? [Expletive of your choice], what the [expletive] is the world coming to these days? :-)
> people who use language in non-standard ways are often judged harshly — and silently — for it
Yes. In my culture, the people who can't speak or write correctly are the ones who are uneducated - usually primary school or less. It's basically the local equivalent of rednecks.
Oh, and I appreciate the correction regardless, because I too think English is hard enough for foreigners to learn without these extra introduced wrinkles.
Thank you, but I do not blog regularly. It's an interesting idea, though, and I appreciate the sentiment. However, none of these are my thoughts, and all of them are easily discovered in the same places I found them myself! :)
I'd say another common theme in these is language hacking (I'm assuming the different thinking modes come with their own language constructs).
And that's the big thing that's changed my life around.
Specifically, how I pulled off the first two was with a practice I'd do whenever I heard a word with an antonym. I'd shrug and say "meh...<original word>...<its antonym>...meh." I came up with this practice because I wanted to learn to suspend/minimize judgment without learning about types of judgments or bothering with distinguishing them. I went as broadly as I could, instead.

I noticed different effects that seemed to correlate with sets of words. Right/wrong helped me see other perspectives, stop demonizing others, and realize people do things because that's how they learned to be. Should/shouldn't helped me release shame. Good/bad helped me stay positive and withhold judgment of the moment. Like/dislike, I think, joined with good/bad to release judgment of things and experiences. What I mean by this is that after maintaining this practice throughout each day for 3 weeks, I realized I'd accidentally abandoned my food preferences, and things that used to trigger disgust in me no longer did. This effect included enjoying types of music/activities/people I used to hate. I came to regard like/dislike as judgement of something as good/bad attached to feeling enjoyment/disgust. I chose to believe I could enjoy anything I put in my mouth. Yes, even that thing you're thinking. No, I probably haven't tried it.
A question I've gotten about this is: "aren't you afraid you'll start enjoying how rotten things taste?" I can still taste when something's gone bad if I've learned to associate the taste with being rotten, but ultimately if I don't know how to identify if something's rotten I then get an opportunity to see if my system can handle it. I can explore the flavors and textures without swallowing, also, if I don't want to get sick. I can also search for how to identify it. Identifying if something is rotten requires learning stuff; it's not innate to us, or, if it is, there's another way to go about things.
I enjoy my life much more after doing this.
Other useful language hacks:
1. Learning to express myself in the pattern of nonviolent communication: When I <observation verb> <non-judgmental observation>, I feel [<emotion(s) without evaluations of others> because I need <need(s)>]. Would you be willing to do _____ to help me meet my need(s) for _____?
I used square brackets [] to indicate a list. Emotions with evaluations of others are things like "offended" or "used" or "abused." Each of these implies someone else did something. The idea is to identify what they did in the first portion & separate it from the emotions that followed. This approach is a practice in leaving behind the thoughts I attach to emotions. I highly recommend reading "Nonviolent Communication" to get more clarity on this.
2. Using "I" statements to describe my experiences, as opposed to "you" or "we" statements. By doing this, I maintain my independence.
3. I started using active voice instead of passive voice. In writing this comment, I noticed a few sentences were in passive voice (including this one, which I chose to not change). An active voice version of the previous sentence would be "In writing this comment, I noticed I'd written a few sentences in passive voice." Using active voice helps me take more responsibility for my actions, instead of blaming others.
4. I choose to use "responsibility" instead of "blame" or "fault." I define blame/fault as assigning 100% responsibility to a person or group of people. How people learn to choose their actions is, in part, a function of how they were raised, the cultures they've been exposed to, and the rest of their environment. Those things are largely out of their control. Blame/fault denies these facts. I find it easier to take responsibility for my actions and find ways to improve using this concept. It also allows me to more easily see how others contributed to something, which leads me to holding them accountable by expressing gratitude for their part and asking them either to continue doing what they did or to make a change (see: Nonviolent Communication requests).
5. The previous trick is a way I specifically apply the idea that no complex system can be 100% accounted for. Apply it to everything.
6. This goes back to non-judgment. Problem/solution resolved into "learning opportunities." This, combined with an understanding of human needs (see: Nonviolent Communication), led me to realize every moment is meeting some needs of mine. At the very minimum, there's the need for opportunities to learn how to better identify and meet my needs. Since I cultivated gratitude for learning opportunities, the previous sentence means I can also meet my need for gratitude of the present moment, which meets my need for joy. Applying these meets my need for connecting with myself. So every unpleasant moment can now be enjoyable, which lays the foundation for figuring out what else I can learn from what's going on.
I could write more, but that's why I'm writing a book on this stuff. The other things are a bit trickier and require preparation for safety's sake because it's possible to get my brain/mind to do really kooky things. If I wasn't properly prepared, I could reconfigure myself in a way that doesn't sustainably contribute to my life. So I'm not sharing that stuff at the moment.
Would love to check out your book whenever you release it! I think a lot of what you do is similar to the Hindu/Buddhist approach of accepting the moment for what it is, and figuring out what you can learn from every moment and experiencing it without too much emotion attached to it. But you seem to have more practical advice on how to accomplish that, which I look forward to hearing.
If you have any specific questions, I'm available to either share what I've come up with if I think it's applicable or to suggest ideas for things to try. I have a sort of programmatic model of the human I'm playing with, so I consider these practices to be human programs. Trying to generate programs for people that have the desired effects is the only way I have to test the model right now, so it'd be mutually beneficial :)
Oh, wow, I realized I started doing 2) a couple of months ago, because one of my coworkers thinks in completely different ways about code in general, which forces me to think in completely new and different ways. I had already questioned my judgements about "good" or "bad" and started thinking in terms of "different", and I'm truly starting to believe it.
Thanks for the bits about eikasia, etc... never heard of them before. About awareness: if someone has formed a perception about something, that perception being wrong or right doesn't change the fact that he has formed it and that it exists, so there may be at least some subtle particle of correctness in it. (The derived thinking I have to avoid goes: "...and this is a clue about something to be discovered, and I have to think and analyze...") Ultimately, being aware that this person's perceptions are just perceptions makes us aware that all our perceptions are just perceptions. No need even to go to the Buddha's and others' theories pointing out that everything is the result of a chain of happenings rolling since before the dinosaurs, a cause-and-effect chain that you are just part of, and that all your perceptions about everything are just the result of what the chain has put in you; so ultimately you have no real independent wish, and "you" don't even exist as an independent entity, if you consider "you" to be your mind with all your perceptions in it.
I like this advice a lot, but the references and the formalized taxonomic names are a severe hindrance. Do you know any description of these ideas that uses more colloquial, modern terms for the taxonomy?
Sorry I was ambiguous in my first post. I meant to ask for reading material that synthesizes these different individual topics together into a more modern taxonomy of concepts that don’t need any reliance on these existing taxonomic terms at all.
I’m not seeking background reading on these concepts as they are already bundled into their own taxonomy, that’s the thing I find unhelpful and am seeking to avoid in favor of some other sources that put them into a more modern framework that obviates any need to reference these specifically except perhaps as historical footnotes or extra reading.
Really, it is the modern terminology, despite being over 2000 years old. Western philosophy is built almost entirely on a sequence of ideas dating back to Plato (and ancient Greek philosophy in general). Modern philosophers analyze the key concepts identified by Plato and his peers, whether they agree with Plato (or Aristotle, or Diogenes, or other writers of the era) or not.
Modern philosophers tend to be extremely dense and difficult to read. What I'd recommend is starting from the beginning with key works of Plato, such as Apology (the trial and death of Socrates), Republic (an ideal government), and Symposium (a friendly conversation about the topics of the day). These are all quite readable and enjoyable, and frankly as good as or better than most modern works.
I'm not aware of any, but I have thought quite a lot about creating a course on this subject. The key issue for me is that modern academic philosophy is perhaps more properly "philosophology", or the study of philosophy. It would be interesting to study and share more practical applications of philosophy, the way the great ancient philosophers perceived philosophy themselves (see John Sellars' The Art of Living for an in-depth analysis of the Greeks' perception of philosophy as a "techne ton bion", or perhaps "lifecraft").
This also tripped me up for a long time. The issue (at least for me) is that modern English is poorly suited to extremely nuanced thought. The best modern language we have today for philosophical discourse is possibly German, and of course Greek and Sanskrit excel at expressing nuance and subtlety. An example of this is the observation that it often takes an entire English sentence or paragraph to express the meaning and shades of meaning that are encapsulated in a single Greek or Sanskrit word.
I agree with you, though; it seems like this language issue is a roadblock for many. The same could be said of mathematics and science, and most of my own learning has revolved around breaking through the initial barrier to entry to a given subject, which takes the form of its technical language and notation.
I'm sorry, but I don't care what you do—if you work in tech, you need to be able to demonstrate an ability to think algorithmically in order to solve problems with a computer. It's a) not acceptable to get by in tech without knowing anything about computers and b) not rocket science—anyone can learn!
What you're saying is true -- but at the same time it is a bit awkward that there are things you need to work on for interviews only. For instance, I have never had to use linked lists in an actual dev job (~6y experience), but they come up so often in interviews that you just have to practice things like loop detection.
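For concreteness, the loop-detection question usually means Floyd's tortoise-and-hare. A minimal sketch, assuming a hypothetical bare-bones ListNode class (my own, not from any particular interview):

```python
# Hedged sketch of Floyd's tortoise-and-hare cycle detection.
class ListNode:
    def __init__(self, val):
        self.val = val
        self.next = None

def has_cycle(head: ListNode) -> bool:
    slow = fast = head
    while fast and fast.next:
        slow = slow.next          # advances one step
        fast = fast.next.next     # advances two steps
        if slow is fast:          # they can only meet inside a cycle
            return True
    return False                  # fast ran off the end: no cycle

# Usage: a -> b -> a (cycle), c alone (no cycle)
a, b, c = ListNode(1), ListNode(2), ListNode(3)
a.next, b.next = b, a
assert has_cycle(a)
assert not has_cycle(c)
```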
It's a little awkward, but not so bad if it's within basics, like linked lists - which one may be rusty on but will generally know or re-derive - rather than really specific algorithms. But that's just a starting point, then it's about the interview and the communication and the interaction - exactly like the examples above of talking to real users or colleagues.
I was working on LLVM (C++ compiler) for 3 years and Java server backends the other 3 years. Those were the main products, with a side of Python, Groovy, Bash and JavaScript. In all fairness, I suspect LLVM uses linked lists internally for things like instructions in a basic block, but that was a level of abstraction under me.
This is less useful than you think. What you need is the ability to look at a problem and solve it. Understanding the problem is a lot harder than most people (technical experts) realise.
Far too often, the technical expert solves a problem on the basis of their current solution space tools and not what the actual problem requires. In other words, they cannot think outside of the solution space box they are stuck in.
Knowing that there are all sorts of algorithms available for all sorts of variations of problems and knowing where you can get the information to implement a solution to your problem is a lot better than being able to reel off one or two algorithms and not know that there are other solutions.
Over the decades, I have become involved with various projects that were created by "programming gurus". They knew the systems involved and made solutions that showed off their "guru-bility". The problem: the systems were a nightmare to make changes to. They used "industry standard" practice and supplied a solution that forced people to change how they did their business, without actually considering the business at hand.
One such system was built using dynamically generated SQL. Had the original "guru" actually thought more carefully about the problem at hand, no dynamically generated SQL would have been needed, and making changes to the system would not have been difficult. In that particular case, I was there doing "after the fact" technical and functional specs. As part of my package of documentation, I included a specification for rebuilding the entire system so that it would be very easy to maintain and change, and the runtime for each system run would have been (by my estimates) reduced to about 5% or less of the then runtime of 25 hours.
To get everything back on track generally required rewrites (sometimes complete rewrites) to be able to extend these systems.
It is our responsibility to enhance the end-user experience, not to make life easy for ourselves.