Anonymous advice (Topic archive) - 80,000 Hours
https://80000hours.org/topic/other/anonymous-advice/

Anonymous answers: What are the biggest misconceptions about biosecurity and pandemic risk?
https://80000hours.org/articles/anonymous-misconceptions-about-biosecurity/
This is Part One of our four-part series of biosecurity anonymous answers. You can also read Part Two: Fighting pandemics, Part Three: Infohazards, and Part Four: AI and biorisk.

We rank preventing catastrophic pandemics as one of the most pressing problems in the world, and we have advised many of our readers to work in biosecurity to have high-impact careers.

But biosecurity is a complex field, and while the threat is undoubtedly large, there’s a lot of disagreement about how best to conceptualise and mitigate the risks. We wanted to get a better sense of how the people thinking about these threats every day perceive the risks.

So we decided to talk to more than a dozen biosecurity experts to better understand their views.

To make the experts feel comfortable speaking candidly, we granted them anonymity. Sometimes disagreements in this space can get contentious, and certainly many of the experts we spoke to disagree with one another. We don’t endorse every position they’ve articulated below.

We think, though, that it’s helpful to lay out the range of expert opinions from people who we think are trustworthy and established in the field. We hope this will inform our readers about ongoing debates and issues that are important to understand — and perhaps highlight areas of disagreement that need more attention.

The group of experts includes policymakers serving in national governments, grantmakers for foundations, and researchers in both academia and the private sector. Some of them identify as being part of the effective altruism community, while others do not. All the experts are mid-career or at a more senior level. Experts chose to provide their answers either in calls or in written form.

Below, we highlight 14 responses from these experts about misconceptions and mistakes that they believe are common in the field of biosecurity, particularly as it relates to people working on global catastrophic risks and in the effective altruism community.

Here are some of the areas of disagreement that came up:

  • What lessons should we learn from COVID-19?
  • Is it better to focus on standard approaches to biosecurity or search for the highest-leverage interventions?
  • Should we prioritise preparing for the most likely pandemics or the most destructive pandemics — and is there even a genuine trade-off between these priorities?
  • How big a deal are “information hazards” in biosecurity?
  • How should people most worried about global catastrophic risks engage with the rest of the biosecurity community?
  • How big a threat are bioweapons?

For an overview of this area, you can read our problem profile on catastrophic pandemics. (If you’re not very familiar with biosecurity, that article may provide helpful context for understanding the experts’ opinions below.)

Here’s what the experts said.

Expert 1: Failures of imagination and appeals to authority

In discussions around biosecurity, I frequently encounter a failure of imagination. Individuals, particularly those in the synthetic biology and public health sectors, tend to rely excessively on historical precedents, making it difficult for them to conceive of novel biological risks or the potential for bad actors within a range of fields. This narrow mindset hinders proactive planning and compromises our ability to adequately prepare for novel threats.

Another frequent problem is appeal to authority. Many people tend to suspend their own critical reasoning when a viewpoint is confidently presented by someone they perceive as an authoritative figure. This can stymie deeper reflections on pressing biosecurity issues and becomes especially problematic when compounded by information cascades. In such scenarios, an uncritically accepted idea from an authoritative source can perpetuate as fact, sometimes going unquestioned for years.

There are various appeals to tradition that frequently skew discussions in broader biosecurity communities. For example, the adage that “nature is the worst bioterrorist” has gained such widespread acceptance that it has become a memetic obstacle to more nuanced understanding. We can be biased in the data we pay attention to, and in times of unprecedented technological progress, we are at great risk of falling prey to our assumptions and “fighting the last war.”

For example, before the detonation of the first atomic bomb, the idea that a single device could wreak destruction on an entire city was nearly inconceivable based on prior warfare technologies. Previous estimates of damage were based on conventional explosives and did not remotely approximate the catastrophic impact of atomic bombs.

The advent of nuclear technology led to a situation where traditional calculations and historical precedents were starkly insufficient in predicting the risks and outcomes. It would be wise to avoid the same errors of reasoning when thinking about future risks from biology.

Expert 2: Exaggerating small-scale risks and Western chauvinism

There is a vast difference between a disease-causing agent or pathogen and something that could actually create massive harm. Treating the two as the same thing creates bad policies. It’s technically possible for someone to grow anthrax spores in their bathtub, but the amount of damage they could do is constrained by the fact that anthrax typically doesn’t spread between people. So, focusing on the biological agents with the largest capability for harm would probably mean prioritising risks other than anthrax.

I also think there’s a lot of chauvinism in biosecurity. This is less true for those who work internationally, and more true for people who work in the US. People always think, “Well, of course Johns Hopkins University isn’t going to be part of a biological weapons programme.” But that’s not the viewpoint of Russia, China, or the DPRK. Stop being so jingoistic. Stop thinking Western governments are always well-meaning and non-Western governments are not. The best way to view a US action is: what would you think if you found out that China was researching the same thing?

The other point about chauvinism is that the risks aren’t the same in the rich and the poor world. The rich world ignores the biosecurity of faecal-oral diseases because we have good sanitation infrastructure. Conversely, antibiotic resistance is top of mind for rich countries that use antibiotics as the cornerstone of a biological response, but will not change the consequences of an attack in a poor country that has limited stocks of antibiotics.

Expert 3: Useless reports and the world beyond effective altruism

I think there are plenty of failure modes in terms of how people go about trying to do good things in the world. Writing a report, sending it into the void, and then expecting impact is a massive failure mode.

Another failure mode is assuming that things that are high status or visible within EA are the most effective. A lot of the best work in biosecurity comes from people most EAs have no idea exist, and those people should be more of the role models that EAs have.

Expert 4: Overconfidence in threat models

I think some people, including senior people, are overconfident in their threat models and have an overly simplistic view of certain issues. There are also a lot of other people who are under-confident and deferential in a way that I think is bad. So something to do with hierarchy and forming views based on deference is probably a big misconception.

Generally EA biosecurity has failed to really grapple with or seriously weigh the benefits of things like gain-of-function research or pandemic discovery. I think the analysis of these things can be quite simplistic and people tend to form the bottom line first. I think most EAs think it is probably bad to discover new pandemic viruses in nature without thinking too much about it. I also think that’s probably bad, but a lot of people could benefit from more openness to being wrong.

Expert 5: Interventions that don’t address the largest threats

I’m generally just witheringly sceptical of incrementalist biosecurity interventions like just hiring more doctors.

In global catastrophic risk, obviously you do want to reduce risk in an incremental way. But you do it by closing off particular parts of a threat landscape.

I tend to favour interventions that are supported by a story that says something like: but for this intervention, this particular threat would have killed 8 or 10 billion people; instead, it kills less than a billion.

COVID was some evidence in favour of my position being correct. The main factor in saving lives was the vaccine. It’s a very heavy-tailed distribution in terms of actual impact. It’s probably an archetypal EA vibe to have, but it’s one I am unapologetic about having.
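
As a rough illustration of what a heavy-tailed impact distribution implies (this sketch is ours, not the expert’s, and the Pareto shape parameter is an arbitrary assumption chosen only for illustration), a small simulation shows how a handful of interventions can account for most of the total impact:

```python
# Illustrative sketch only: impacts drawn from a Pareto distribution with an
# assumed shape parameter of 1.1, chosen purely to demonstrate heavy tails.
import random

random.seed(0)  # reproducible draws
impacts = sorted((random.paretovariate(1.1) for _ in range(10_000)), reverse=True)

total = sum(impacts)
top_share = sum(impacts[:100]) / total  # share contributed by the top 1%
print(f"Top 1% of simulated interventions: {top_share:.0%} of total impact")
```

Under assumptions like these, a small fraction of interventions dominates the total, which is the intuition behind favouring interventions aimed at the very largest threats.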

Expert 6: Over-reliance on past experience and overconfidence about the future

There are two general categories of common failures in most policy efforts. One is based on an over-reliance on past experience and the other is based on a flawed sense of confidence about the future.

For biosecurity and pandemic risk, we shouldn’t overly rely on successful interventions of the past in order to protect us against future biological incidents — such as vaccination and other medical countermeasures. One of the major successes during the COVID-19 pandemic was the rapid development of vaccines. So it’s easy to understand why folks would want to double down on an approach that worked. In fact, it’s an approach that has worked for combatting and eradicating infectious diseases for many decades.

However, medical countermeasures should not be the cornerstone of our playbook moving forward, not least because we’ve seen their limitations in preventing disease transmission, and because of anti-vaccine sentiment.

Plus, if we think that medical countermeasures are the answer to all of our pandemic problems, we may be motivated to use AI or research automation to design and develop medical countermeasures at any cost — even if the tools and approaches we use could be misused.

On the other hand, a future world where biology becomes increasingly programmable is often imagined by those with confidence and optimism about the future. But it’s this same confidence in biology for good that prevents them from visualising a world where biology can be misused by people with less expertise and fewer resources.

These optimists are often the same ones who believe nothing robust should be done to address dual-use risk because it will limit innovation and its potential life-changing transformations. But it doesn’t quite make sense how innovations through AI and machine learning can be so life-changing for good without commensurate potential to be life-changing for bad. We need to think about minimising the risk with the same level of rigour as maximising the good.

Expert 7: Resistance from governments and antagonising expert communities

I think people overestimate how likely it is that an intervention that prevents global catastrophic biological risks (GCBRs) would be adopted by governments. In my experience, it’s challenging to get interventions that provide measurable, real-world public health benefits funded and adopted — unless they can be proven to pay for themselves and offer regular benefits that can be reported to decision makers.

Interventions that may never be used or only be used very occasionally also can’t be tested for their accuracy and usefulness in a real-world setting. Similar interventions used routinely can fail in unexpected ways or for reasons that weren’t considered, so getting regular feedback based on real-world settings is crucial.

This is especially important in determining whether decision makers and the public trust the information provided, as a lack of trust or confidence can undermine a perfectly good early warning system. I also see people treat GCBR preparedness, public health, and global health interventions as different things when in reality, there are solutions that can be used for multiple purposes and can provide a better case for adoption if considered jointly.

Another issue I see is people positioning themselves in opposition to pathogen researchers, synthetic biologists, AI developers, or people working in public health due to disagreements about how they work and the information they generate. I’ve heard complaints that these groups won’t engage on these issues from the very people who talk about these groups, and to them, in an extremely negative way.

I’m not at all surprised that these groups aren’t helpful and collaborative when they’re being insulted. These communities offer a wealth of knowledge we desperately need to solve these problems effectively and taking an adversarial stance causes them to respond in kind.

When you engage these groups as collaborators, I’ve found that they are incredibly enthusiastic to get involved and adapt their practices. And when they’re not, it’s because they see a real-world harm as a consequence of caution that the biosecurity community hasn’t recognized or valued enough.

This approach also leaves people in biosecurity re-inventing the wheel when existing solutions and infrastructure could be adapted to be more biosecurity-forward. People who regularly work with infectious diseases often have a great solution in mind that we could be advocating for, instead of us trying to come up with something new despite having no experience of dealing with a real outbreak.

It can also lead to biosecurity folks developing an intervention that would fail or not be adopted in practice. There seems to be this thinking that pathogen and public health researchers are too short-sighted to care about GCBRs or aren’t clever/focussed enough to come up with solutions to the problems we worry about. In reality, they’ve often tried and either the intervention didn’t go as well as expected, or they couldn’t get anyone to pay for it.

Expert 8: Preparing for the “Big One” looks like preparing for the smaller ones

I believe some in the effective altruism/longtermism community see a strong disconnect between “ordinary” pandemics of the kind we have recently experienced or smaller, and “existential threats.” While size and severity surely matter, the systems for responding to one are the same ones that will respond to the other — and failure to control a smaller one is the most likely way in which we get to a larger one. So investments in preparing for the “Big One” should primarily (maybe not exclusively) be those that also prepare for the small and medium-sized ones.

Expert 9: ChatGPT is not dangerous

The one that I hear most recently is that ChatGPT is going to let people be weaponeers. That drives me berserk because telling someone step-by-step how to make a virus does not in any way allow them to make a weapon. It doesn’t tell them how to assemble it, doesn’t tell them how to grow it up, it doesn’t tell them how to purify it, test it, deploy it, none of that.

If you lined up all the steps in a timeline of how to create a bioweapon, learning what the steps are would be this little chunk on the far left side. It would take maybe 10 hours to figure that out with Google, and maybe ChatGPT will give you that in one hour.

But there’s months and years worth of the rest of the process, and yet everyone’s running around saying, “ChatGPT is going to build a whole generation of weaponeers!” No way.

Another dubious take is having information hazards rule everything. That we should never do anything to create new information hazards.

To me, that is crazy. Because you basically are saying you’re going to shut off your ability to understand the threat just because you’re worried about possibly informing someone somewhere a little bit more, despite the massive pile of public literature that already exists.

It really ends up being an argument about who’s going to be faster, the weaponeer or the person working on the defensive side. I would much rather have that contest than unilaterally disarm, which is what the information hazard argument insists we do.

I would also argue weaponisation is a years-long timeline.

We have never in human history created a virus from scratch to exploit a new vulnerability in a biological system. I can’t imagine that taking place in less than a couple of years. Over the course of those years the individuals will be throwing off lots of signals that would be detectable.

And so the idea that those weaponeers would race to the finish in six months and have a little tube that they could throw around — I just don’t buy it. I see no evidence of what humans have done so far that would get us anywhere near any of that. And telling ourselves stories about how hard that is or how easy that is, I think is really harmful.

We should at least be realistic in our estimates and judgments of how hard or easy that process would be. And right now, we’re shortcutting all of that and just writing science fiction.

Expert 10: The barriers to bioweapons are not that high

There are a lot of misconceptions out there, in part because the academic community that writes about these things has been very small for a very long time. A lot of the writings that are out there on biological weapons are just wrong.

It’s hard because a lot of it is classified, but a simple example would be the claim that a country would never use a communicable biological weapon because it would blow back on their own people. The fact is that the Soviet Union perfected smallpox as a biological weapon. So the fact that they did it tells me that countries will do it.

One misconception that may once have been somewhat true — but now it’s much less true, and in the future it will be even less true — is that tacit knowledge is required to make biological weapons. That the barrier is so high.

I don’t think that’s the case. I actually don’t think that was the case 20 years ago or 30 years ago, but it’s certainly not the case today.

I think it depends on which specific biological weapon. It’s relatively easy with the information available today for a determined small group or individual to successfully produce and deliver biological weapons. There was a book written about this that studied the Soviet program, and the author spoke to many former Soviet bioweaponeers. I think it’s natural that they would exaggerate their craft and describe how “only people with special skills like me could do this.” I think that biased the conclusions of that book.

Expert 11: We don’t need silver bullets

Some people think laws and rules, and the tools and technologies used to implement them, do not control bad actors. They only influence the behaviour of good actors. But this misses the deterrent effect of laws and rules.

Some people look only for silver bullets and they discount solutions that might be circumvented. This misses the point that raising barriers to malign misuse of technology can still be highly beneficial.

Expert 12: Three failure modes

A few misconceptions and mistakes:

  1. Failure to engage broadly with knowledge and expertise in the community and trying to reinvent things from scratch.
  2. Failure to account for the fact that biosecurity is exceptionally broad and other people are likely solving for different problems and have different threat models.
  3. Failure to get out into the world and talk with people — most information still isn’t written down.

Expert 13: Groupthink and failing to be scope sensitive

I think that there’s a lack of scope sensitivity. People are not treating the catastrophic potential outcomes in biosecurity as six billion times more concerning — or whatever the right multiplier would be — than cases where one or two people are affected.
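
To make the multiplier concrete (the probabilities and scales below are invented for illustration, not taken from the interview), scope-sensitive reasoning weights each outcome by the number of people affected:

```python
# Illustrative sketch only: invented probabilities and scales, chosen to show
# what scope-sensitive (expected value) reasoning looks like in practice.

def expected_harm(probability: float, people_affected: float) -> float:
    """Expected number of people harmed: probability times scale."""
    return probability * people_affected

small_incident = expected_harm(probability=1e-2, people_affected=2)
catastrophe = expected_harm(probability=1e-6, people_affected=6e9)

print(f"Small incident: {small_incident:,.2f} expected people harmed")
print(f"Catastrophe:    {catastrophe:,.0f} expected people harmed")
# Even at a 10,000x lower probability, the catastrophe dominates in
# expectation, because its scale (billions vs. a couple of people) is so large.
```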

I also think there’s some groupthink in the effective altruism biosecurity community, and there is a very important need for people to work in and be exposed to other communities that have some very different perspectives on the issues to understand:

  • Why are these the ideas that the effective altruism community has produced?
  • Why aren’t they being implemented?
  • Why do people disagree with them?

I think that there’s a whole set of sort of feasibility and implementation dynamics that are not well understood by many EAs in this field. I think it’s really important that people expose themselves to the organisations and the people that have been working on this for a long time, the established institutions and scientists, to have a better understanding of why people are doing things the way they do. Sometimes you may still disagree and feel a different approach would be better, but there’s usually a logic to it that is not always fully grasped.

Expert 14: COVID-19 does not equal biological weapons

One common misconception is that because SARS-CoV-2 [the virus that causes COVID-19] was able to cause this horrific human pandemic, any biological weapon would be equally devastating. That this pandemic exposed our vulnerability to biological weapons and that state and non-state actors are immediately going to exploit that vulnerability and attack with biological weapons.

I think that is misleading because this virus evolved to be as infectious and contagious as it was. What Mother Nature is able to do with pathogens, I think, still exceeds what humans can do in this respect.

COVID-19 does not equal biological weapons, bluntly. This is a very specific pandemic that emerged that is not the same as what any country has been able to do in the past or any non-state actor is able to do.

There is also an assumption that because this vulnerability was revealed, lots of states and non-state actors are going to launch biological weapons programmes.

I don’t necessarily think that’s the case. Most countries and terrorist groups don’t have or want or need biological weapons because they will not serve their purposes. We should not expect this dramatic change in calculus based on this one natural pandemic. We’ve had other pandemics historically, lots of flu pandemics that have not resulted in wholesale changes in the way that states or terrorist groups view these things.

Because the reality is biological weapons just don’t work that well and aren’t going to meet the objectives of countries and groups that are looking for new capabilities.

There are lots of misconceptions about how easy it is to develop and use a biological weapon. There are a lot of steps that go into converting a pathogen into an actual weapon that is capable of causing mass casualties, and that process is not necessarily going to be published in the open literature.

It’s not something that people can just learn in grad school, right? Very arcane knowledge and skills go into this, and they are just not readily accessible. So there’s a very common downplaying of the role of tacit knowledge in this process of weaponization. It’s very easy for people to claim that if you can find a pathogen in nature and culture it, then you can turn it into a weapon to cause a pandemic or mass casualties.

And that’s just not the case. We’ve seen nation states struggle to develop these weapons and non-state actors have been very unsuccessful with them because they are very challenging to acquire and develop and produce and use properly.

I guess the last misconception I’ll highlight is that advances in life sciences and technology are democratising the proliferation of biological weapons — that this technology, these advances, are easily converted into misuse.

And again, that’s just not the case. The technologies that are developed are usually developed for very specific civilian scientific purposes that would need to be modified to be capable of causing mass casualties. That’s not a simple, straightforward, easy process by any means.

People overestimate the degree of threats to biosecurity. Again, I’m not saying that there aren’t any, but the tendency is more to exaggerate the threats than not.

And frequently this comes from people who don’t understand the biology of these organisms, and they come out of different epistemic communities. They don’t understand the nuances and the complexities involved in what it takes to actually create a biological threat.

It’s safe to say that the knowledge and capabilities are increasing because of advances in dual-use technologies. But that’s very much a latent capability that’s increasing. It’s another thing to say that this latent capability will transform into something capable of misuse; that just assumes there’s someone out there with an intent and motivation to do so.

That latent capability for modifying pathogens or producing them is clearly expanding in terms of our level of knowledge and the number of people out there who are able to do it. But it doesn’t necessarily mean that the intent is growing. And that’s the part that people kind of lose sight of and don’t pay attention to.

If someone is going to want to misuse biology, there has to be this intent or motivation. And that’s the part that people just kind of take for granted.


What 80,000 Hours learned by anonymously interviewing people we respect
https://80000hours.org/2020/06/lessons-from-anonymous-interviews/

We recently released the fifteenth and final installment in our series of posts with anonymous answers.

These are from interviews with people whose work we respect and whose answers we offered to publish without attribution.

It features answers to 23 different questions, including “How have you seen talented people fail in their work?” and “What’s one way to be successful you don’t think people talk about enough?”

We thought a lot of the responses were really interesting; some were provocative, others just surprising. And as intended, they spanned a wide range of opinions.

For example, one person had seen talented people fail by being too jumpy:

“It seems particularly common in effective altruism for people to be happy to jump ship onto some new project that seems higher impact at the time. And I think that this tendency systematically underestimates the costs of switching, and systematically overestimates the benefits — so you get kind of a ‘grass is greener’ effect.

In general, I think, if you’re taking a job, you should be imagining that you’re going to do that job for several years. If you’re in a job, and you’re not hating it, it’s going pretty well — and some new opportunity presents itself, I think you should be extremely reluctant to jump ship.

I think there are also a lot of gains from focusing on one activity or a particular set of activities; you get increasing returns for quite a while. And if you’re switching between things often, you lose that benefit.”

But another thought that you should actually be pretty open to leaving a job after ~6 months:

“Critically, once you do take a new job — immediately start thinking “is there something else that’s a better fit?” There’s still a taboo around people changing jobs quickly. I think you should maybe stay 6 months in a role just so they’re not totally wasting their time in training you — but the expectation should be that if someone finds out a year in that they’re not enjoying the work, or they’re not particularly suited to it, it’s better for everyone involved if they move on. Everyone should be actively helping them to find something else.

Doing something you don’t enjoy or aren’t particularly good at for 1 or 2 years isn’t a tragedy — but doing it for 20 or 30 years is.”

More broadly, the project emphasised the need for us to be careful when giving advice as 80,000 Hours.

In the words of one guest:

“trying to give any sort of general career advice — it’s a fucking nightmare. All of this stuff, you just kind of need to figure it out for yourself. Is this actually applying to me? Am I the sort of person who’s too eager to change jobs, or too hesitant? Am I the sort of person who works themselves too hard, or doesn’t work hard enough?”

This theme was echoed in a bunch of responses (1, 2, 3, 4, 5, 6).

And this wasn’t the only recurring theme; there were another 12.

You can find the complete collection here.

We’ve also released an audio version of some highlights of the series, which you can listen to here, or on the 80,000 Hours Podcast feed.

These quotes don’t represent the views of 80,000 Hours, and indeed in some cases, individual pieces of advice explicitly contradict our own.

All entries in this series

  1. What’s good career advice you wouldn’t want to have your name on?
  2. How have you seen talented people fail in their work?
  3. What’s the thing people most overrate in their career?
  4. If you were at the start of your career again, what would you do differently this time?
  5. If you’re a talented young person how risk averse should you be?
  6. Among people trying to improve the world, what are the bad habits you see most often?
  7. What mistakes do people most often make when deciding what work to do?
  8. What’s one way to be successful you don’t think people talk about enough?
  9. How honest & candid should high-profile people really be?
  10. What’s some underrated general life advice?
  11. Should the effective altruism community grow faster or slower? And should it be broader, or narrower?
  12. What are the biggest flaws of 80,000 Hours?
  13. What are the biggest flaws of the effective altruism community?
  14. How should the effective altruism community think about diversity?
  15. Are there any myths that you feel obligated to support publicly? And five other questions.

Anonymous answers: Are there myths you feel obliged to support publicly? And five other questions.
https://80000hours.org/2020/06/anonymous-answers-myths-and-other-questions/

It’s alarming whenever someone says “this is obviously the best thing to do”, when in reality we have very little information in so many spaces.

Anonymous

The following are excerpts from interviews with people whose work we respect and whose answers we offered to publish without attribution. This means that these quotes don’t represent the views of 80,000 Hours, and indeed in some cases, individual pieces of advice explicitly contradict our own. Nonetheless, we think it’s valuable to showcase the range of views on difficult topics where reasonable people might disagree.

The advice is particularly targeted at people whose approach to doing good aligns with the values of the effective altruism (EA) community, but we expect most of it is more broadly useful.

This is the fifteenth and final in this series of posts with anonymous answers. You can find the complete collection here.

We’ve also released an audio version of some highlights of the series, which you can listen to here, or on the 80,000 Hours Podcast feed.

Did you just land on our site for the first time? After this you might like to read about 80,000 Hours’ key ideas.

In April 2019 we posted some anonymous career advice from someone who wasn’t able to go on the record with their opinions. It was well received, so we thought we’d try a second round, this time interviewing a larger number of people we think have had impressive careers so far.

It seems like a lot of successful people have interesting thoughts that they’d rather not share with their names attached, on sensitive and mundane topics alike, and for a variety of reasons. For example, they might be reluctant to share personal opinions if some readers would interpret them as “officially” representing their organizations.

As a result we think it’s valuable to provide a platform for people to share their ideas without attribution.

The other main goal is to showcase a diversity of opinions on these topics. This collection includes advice that members of the 80,000 Hours team disagree with (sometimes very strongly). But we think our readers need to keep in mind that reasonable people can disagree on many of these difficult questions.

We chose these interviewees because we admire their work. Many (but not all) share our views on the importance of the long-term future, and some work on problems we think are particularly important.

This advice was given during spoken interviews, usually without preparation, and transcribed by us. We have sometimes altered the tone or specific word choice of the original answers, and then checked that with the original speaker.

As always, we don’t think you should ever put much weight on any single piece of advice. The views of 80,000 Hours, and of our interviewees, will often turn out to be mistaken.

What’s the worst advice you commonly see people give?

‘You should obviously do this’

People give advice with near 100% confidence. They say “you should obviously do this”. And once someone respected says that “the best thing to do in this field is X” — impressionable people have a tendency to commit to that path, even if they have no reason to think they have the relevant background/skills to succeed.

It’s alarming whenever someone says “this is obviously the best thing to do”, when in reality we have very little information in so many spaces.

A lot of people are both smart and overconfident — and the smarter they are, the more easily they can lead other people astray. The thing they’re working on might not be obviously bad, but if they’re saying it’s obviously the only right approach, and they sound convincing, that can do a lot of damage.

‘You need to get into a priority career track’

I think the framing of trying to get you to focus so heavily on getting into a priority career track is probably overplayed. I don’t think it should be given zero weight, there are many reasons why it’s a good idea to go into a priority career track. If you have two opportunities that look about as good as each other, one in a priority career track and one not, you should go with the priority one. And that’s often the case for many people.

But every time I hear someone say “I don’t really think I’d be good at this”, or “I don’t really think I’d like it, but I’m going to do it anyway because this is the priority career track that we’re supposed to do”, I think that person is usually making a mistake.

‘What have you achieved this week?’

A lot of people seem to think about career success with very short time horizons — if they haven’t done anything huge or impressive in the last couple of weeks they feel like a failure. So I think a lot of people are going around constantly feeling like a failure, when in reality their performance is perfectly good and their career trajectory overall is perfectly good. I think it would be a good idea to focus less on whether you’ve had a good week, and focus more on whether you’re having a good year.

‘Only do things you love’

Overall you want to get to a point where you love what you’re doing — otherwise you’ll just give up — but there may be specific times where for 2 or 3 years it’s better to accept a kinda horrible project and just muscle through it.

For academia, for example, the key thing is to get tenure. And if that requires sucking it up, and doing things that are less rewarding for a few years, that’s probably the wiser thing to do.

‘Say yes to everything, work at an EA org, don’t worry about networks’

To say yes to everything. Don’t do that — you’ll spread yourself too thinly. Be selective.


Advice that doesn’t include building a strong network.


Within EA — to try to work at an EA org.


Follow your passion!

Do you have any philosophical views that you’d be hesitant to state publicly?

We should cut people more slack

I honestly think that we should cut people more slack than we do. I often watch as an internet mob destroys someone — someone who undeniably did or said something bad — and basically think “There but for the grace of God, go I”. Not because I go around doing or saying things that are bad, but because I’m human and I get frustrated or don’t sleep or whatever, and I snap at people or I say something dumb for no obvious reason. Maybe not as bad as what they did, but certainly something that isn’t a reflection of who I am.

But we often seem to think that when we do bad things, it’s usually at least partly because of something outside of our control, like stress, but if someone else does a bad thing, it must be because they’re a bad person. If you take this attitude, plus norms around social exclusion that are designed for small groups of people, and add the internet to it, I think you get a system that is often overly punitive.

You might think I’m basically saying something like “stop being so mean to people who are caught on camera doing bad stuff and mobbed by the internet” but I think cutting people more slack implies a lot more than that. It implies that we should stop wanting victims to be perfect before we take them seriously, for example. But it’s hard to make the point that even if someone has done a bad thing, maybe that doesn’t mean they should have their life ruined over it without just causing a bunch of people to turn their anger on you instead.

Feminism ought to focus more on practical concerns

I think that movements like feminism tend to focus on the edge of what is possible: like we’ve won so many victories for women, what might it look like to achieve something like true equality for even a small number of women?

But there are so many countries where women have nothing like those kinds of rights. And there aren’t that many abstract, intellectual questions about what needs to happen there to improve equality — they need better healthcare, and more money, and more freedom about whether they have a partner, when they have a partner, whether they have kids, when they have kids. And I worry that when people get so interested in the theory, these practical concerns get kind of lost.

I want people thinking about the cutting edge of feminism, but I don’t want everyone to think that’s where you should necessarily be putting all the resources. Maybe a compromise would be: I think we should put some intellectual resources there, but put most of our money somewhere else.

Factory farming is a moral abomination

It depends on the audience. I think factory farming is a moral abomination, but I won’t always mention that if I’m talking to people who are new to EA. It can just turn someone off completely — you can see them stop listening to you as soon as you mention animals. I’d always mention it if asked directly, but unfortunately it can be better to wait a while before you bring it up yourself.

You should be free to join any tribe you want

I think the ideal world would have no identity you are born with. No pre-assigned culture or race. You would be free to join any tribe you want to.

Also, I believe science is uniquely qualified to describe reality. Other types of knowledge (religious, cultural, revelation) are not as accurate at creating a model.

Some technically true things shouldn’t be said

I think there are things that are technically true — but that people still shouldn’t say because it’s impossible to say them without giving the wrong impression.

If someone says “curing TB is more important than helping hurricane victims”, that might be technically true just given the number of people affected by both. But instead of hearing the message “we both agree about how bad hurricanes are, and this shows how bad TB is”, people will just think that you don’t think helping hurricane victims is important.

We naturally think that the importance of things is zero sum — if you’re saying one thing is really important, you must think the other thing isn’t important. But I don’t think helping hurricane victims is any less important than most people do — I just have to make these decisions to triage where I can do the most good. So even though I think a lot of comparisons between causes are technically true, I think it’s hard to make them in a way that doesn’t do more harm than good.

I think it’s good to focus a little less on whether you’re technically right, and a little more on what your stated views convey to others.

Are there any myths that you feel obligated to support publicly?

‘We have free will’

I don’t go around saying that free will is an illusion, even though that’s what I believe.

‘We should prioritise climate change interventions at the expense of other long-run future causes’

I feel a lot of pressure to say that climate change is an important global catastrophic risk. And if you think that the most important way to do good is to positively influence the long-run future, it’s quite hard to make the case for prioritising climate change interventions. There are some people that say this publicly, but a lot more that think it privately. I don’t support the claim that longtermist EAs should be paying more attention to climate change, but I dispute it much less vigorously than I think is right.

Speculation

I’m cautious with speculation. There are several things where I think I know what’s going on, and I haven’t said it publicly because there’s a chance I’d be wrong — and it’d be pretty damaging if I were wrong.

‘My field is doing very valuable work’

I certainly tend to talk up my own field — I don’t think I could come and say “well, this is kind of valuable work, but…” You should check in every now and then and think “is the work I’m doing actually valuable?”, but of course you can’t easily tell the people around you that you’re considering swapping your career path.

‘We should follow the conventions of society’

I think ‘myths’ is too strong a word; it implies that something is false. But I think it can be important to support some norms, even when you don’t think they’re fundamentally true.

There are a lot of norms that are just really good given the way the world is, and ones that I’d want to perpetuate.

For example, a norm where people generally follow the conventions of society seems like a good one — even if you disagree with some of those conventions. So, there’s a point at which laws become unjust, very seriously unjust, and then at that point maybe you want to deviate from them. But there’s a tricky zone where you should probably follow rules that you disagree with, just because there is group consensus around them.

What are the biggest flaws of academia?

Not enough focus on the most important questions

Academia is really good at selecting for bright, hard-working people — but doesn’t systematically get them working on the things that are most important. This is partly because there are incentives for them to work on things where they can demonstrate that they’re good and impressive. And that often means making progress on things that other people have been stuck on, which means working on the kinds of problems that people have already worked on a lot. There just isn’t necessarily a market for people to do work which falls between the cracks.

You see this at large scale within academia where work between disciplines is often neglected. But even in specialised fields, you get trendy topics and they attract a lot of work — and more important topics fall through the cracks.


There just isn’t much overlap between the questions tackled by academia and the most important questions. Theoretically, there could be, but — at least until you get tenure — there are strong disincentives to work on the most interesting and useful problems. You’re incentivised towards working on the kinds of things that will get you tenure. You can try to find things that overlap, but basically you need to at least partially sell out for a few years. And that’s hard.

Not valuing effectiveness enough

Being too siloed: You hear lots of people in academia say “we’re open to interdisciplinary collaboration”. And yet when you work in a variety of areas, you get feedback like “so… what do you focus on?” It still seems weird to most people in academia for someone to have diverse research interests, and that’s frustrating.

Not valuing effectiveness: The pressure to raise money generally makes sense: if you can bring in money, you can do more high-impact research. The problem is that the mainstream paths to raising money don’t usually value effectiveness nearly as much as they should.

When I was applying for academic jobs, I didn’t talk about what I really care about. I talked about my mainstream research — and then I introduced what I was hoping to work on after I started the job.

I don’t think the pressure to publish is a flaw, because I think even though it is easier to do EA forum posts, actually going through peer review does bring significant credibility to one’s work inside and outside of EA.

Let’s say the EA community doesn’t exist in 20 years (but there hasn’t been a major global disaster or anything). What’s your best guess for what happened?

A lack of diversity

I think the lack of diversity in the EA community is a big risk. It could lead to some things being said that blow up the movement and make it toxic to associate with, or even career suicide.

A damaging scandal

My first guess is that there was some really damaging scandal — such that almost no one could afford to be openly associated with EA. But I think it would still exist, just in private.

Causes splintering off

I’d imagine there might be a splintering off. Maybe one of the cause areas, like reducing catastrophic risks, becomes “the thing”. I’m sympathetic to longtermism, but it’s important to be able to check in and also consider “is this actually the right thing we should be working on?”

There’s also a risk in losing influence. No one’s going to listen to the EA movement if it’s just become “all x-risk, all the time” — that just sounds too crazy to the general public.


Different main cause areas get absorbed into their larger fields. Lack of cohesion. Community drama. Decrease in funding. All of that might just mean that there’s no longer this “EA community”.

A gradual degrading in the quality of the people

Probably just a gradual degrading in the quality of the people it attracts. The draw of EA for me was that it seemed like a magnet for all these super smart, thoughtful people who also cared about the world — if it becomes harder for the most talented people to meet each other (because they’re instead meeting people who are just okay), then it will become less attractive for them. The “quality” of the community would then gradually degrade.

The default is entropic, everything kind of degrades. So my intuition is that you need active, conscious injections of energy and effort to keep things at a great level. It would probably exist in some form, but maybe the best people aren’t interested anymore.

Ideas become mainstream

Probably the ideas became mainstream to the point where it didn’t make sense to have a ‘community’.

Lack of sustained motivation

I’d guess a lack of continued motivation to participate over longer timelines. The community hasn’t really made many tools for long-term pledges or public accountability.

Not going to happen

Assuming no one comes up with an argument that EA doesn’t make sense, I can’t imagine it going away.

How should people in EA think about having children?

People shouldn’t deny themselves their deepest desires

I think that people shouldn’t deny themselves their deepest desires, in most cases. And so if having a child is a deep desire, I think it’s unhelpful and unhealthy to feel any guilt about having that child.

It’s unlikely that having a child is going to increase your impact, but if you do something that you really care about — like having a kid, or writing a novel — and then feel really guilty about it, you’re encouraging other people to feel the same way.

You can imagine a world where that’s correct — where you want a norm of deep self-sacrifice. But I think that is the wrong norm for EA, and instead we should have norms where maybe being an EA is one of three deep desires — and you just try to do a pretty good job of taking care of all three things.

I think the version of EA where you’re not allowed to pursue any other costly goal is a version that will rightly turn off a lot of the most important people — people who are passionate about a lot of things.

Man, it’s complicated though. Saying this, I notice that I don’t plan to have children so I can focus on having more of a positive impact, and there’s at least a part of me that deeply wants to.


When thinking about this topic, I think it’s important to think about what our goal is as a movement. While a strong utilitarian case can be made against having children, I think the value of seeming as normal as possible outweighs it. Most people want to have children — and it doesn’t seem worth the costs to encourage norms against such a natural desire.

Kids are a way of creating something positive in the world

Children are nice, if EA doesn’t work out at least you’ve created something positive in the world. Having a healthy family life seems potentially really valuable. If you think the future is likely to be good, do you want to be the only one of your lineage who didn’t get things together?

It’s hard to make the case for having more children

If you’re someone who doesn’t have an inclination towards having kids — embrace that. There can be huge benefits. The time saved can be extremely valuable.

That said, if someone wants to have children, they shouldn’t feel like they have to prioritize the pure efficiency of not having them. Placing this need aside could have a negative impact and cause the person to resent their presence in EA, maybe making them even less effective overall. On the other hand, it’s hard to make the case that EAs should be making an effort to have more children. Children are time consuming and expensive.


Because of mean reversion, I think it’s likely that the children of EAs will tend to not be as effective as EAs, so I think EAs should have fewer children to maximize their impact now. As for intelligent people in general, the Flynn effect (IQ increasing over time, though maybe that has stalled?) is larger than the effect of the lower birthrate of more highly educated people, so I don’t think it’s a huge deal, especially because I think that things are going to change radically in the next 100 years anyway.

It doesn’t matter that much

I’m not sure it matters as much as people seem to think. If you want to have kids, it seems fine to have them. It also seems fine to choose to sacrifice that part of your life by delaying having kids, say, in order to strengthen parts of your career. Or to say “if I wasn’t doing important work I would want to have children — but because I think I am doing important work, I’m happy to make that sacrifice”. You get to make these choices about what you’re willing to sacrifice, and what you’re not.

That’s not to say there isn’t some correct answer to the question if you’re trying to maximize your impact. There’s probably an answer to the question about which hobbies EAs should have too. “Should this particular EA take up woodworking?” — there’s probably an answer to that, but I think we all think there needs to be space for people to make these kinds of decisions based on how it will affect them, rather than the impact it will have on the world.

I think you should probably view having kids as you would other parts of your life that would take up a similar amount of time and money. You shouldn’t assume your kids will have a positive impact, they’re probably not going to go out into the world and spread views about doing the most good (if that’s important to you, you could achieve the goals of future generations sharing your views by writing etc). But that doesn’t matter. I think we shouldn’t try to give an argument for having kids based on impact, because it kind of presupposes that this is how we should make all decisions. People need to have well-rounded lives.


Learn more

Other relevant articles

  1. Your career can help solve the world’s most pressing problems
  2. All the evidence-based advice we found on how to be successful in any job
  3. Find a high impact job on our job board
  4. Career advice I wish I’d been given when I was young

All entries in this series

  1. What’s good career advice you wouldn’t want to have your name on?
  2. How have you seen talented people fail in their work?
  3. What’s the thing people most overrate in their career?
  4. If you were at the start of your career again, what would you do differently this time?
  5. If you’re a talented young person how risk averse should you be?
  6. Among people trying to improve the world, what are the bad habits you see most often?
  7. What mistakes do people most often make when deciding what work to do?
  8. What’s one way to be successful you don’t think people talk about enough?
  9. How honest & candid should high-profile people really be?
  10. What’s some underrated general life advice?
  11. Should the effective altruism community grow faster or slower? And should it be broader, or narrower?
  12. What are the biggest flaws of 80,000 Hours?
  13. What are the biggest flaws of the effective altruism community?
  14. How should the effective altruism community think about diversity?
  15. Are there any myths that you feel obligated to support publicly? And five other questions.

Anonymous contributors answer: How should the effective altruism community think about diversity?
https://80000hours.org/2020/04/anonymous-answers-diversity/

I genuinely think you get snowball effects from the composition of a community. If a group of people are similar in a certain way, you’re more likely to get more people like that. I think you can end up in bad situations where you have basically no people from X group in the movement, and then that maybe means you can’t ever get people from X group in the movement. That seems really bad to me.

Anonymous

The following are excerpts from interviews with people whose work we respect and whose answers we offered to publish without attribution. This means that these quotes don’t represent the views of 80,000 Hours, and indeed in some cases, individual pieces of advice explicitly contradict our own. Nonetheless, we think it’s valuable to showcase the range of views on difficult topics where reasonable people might disagree.

This entry is most likely to be of interest to people who are already aware of or involved with the effective altruism (EA) community.

But it’s the fourteenth in this series of posts with anonymous answers — many of which are likely to be useful to everyone. You can find the complete collection here.

We’ve also released an audio version of some highlights of the series, which you can listen to here, or on the 80,000 Hours Podcast feed.

Did you just land on our site for the first time? After this you might like to read about 80,000 Hours’ key ideas.

In April 2019 we posted some anonymous career advice from someone who wasn’t able to go on the record with their opinions. It was well received, so we thought we’d try a second round, this time interviewing a larger number of people we think have had impressive careers so far.

It seems like a lot of successful people have interesting thoughts that they’d rather not share with their names attached, on sensitive and mundane topics alike, and for a variety of reasons. For example, they might be reluctant to share personal opinions if some readers would interpret them as “officially” representing their organizations.

As a result we think it’s valuable to provide a platform for people to share their ideas without attribution.

The other main goal is to showcase a diversity of opinions on these topics. This collection includes advice that members of the 80,000 Hours team disagree with (sometimes very strongly). But we think our readers need to keep in mind that reasonable people can disagree on many of these difficult questions.

We chose these interviewees because we admire their work. Many (but not all) share our views on the importance of the long-term future, and some work on problems we think are particularly important.

This advice was given during spoken interviews, usually without preparation, and transcribed by us. We have sometimes altered the tone or specific word choice of the original answers, and then checked that with the original speaker.

As always, we don’t think you should ever put much weight on any single piece of advice. The views of 80,000 Hours, and of our interviewees, will often turn out to be mistaken.

How should the effective altruism community think about diversity?

Distinguish different types of diversity

I distinguish epistemic diversity from racial and gender diversity. I think these two things are both important, but very different. People sometimes bundle them together, but I think they’re at best weakly correlated.

You get snowball effects from the composition of a community

I think we want to do a lot better than we currently are on racial and gender diversity, and to improve in as visible a way as possible.

I think that’s for three reasons, in order of importance.

One is that there’s a possibility of being branded as toxic by the social justice movement. That would probably be my number one pick for the most likely existential risk to the EA community. They care a lot about racial and gender diversity, so our caring about it a lot too mitigates that risk.

A second is that I genuinely think you get snowball effects from the composition of a community. If a group of people are similar in a certain way, you’re more likely to get more people like that. I think you can end up in bad situations where you have basically no people from X group in the movement, and then that maybe means you can’t ever get people from X group in the movement. That seems really bad to me.

Third is implicit bias. If we have this strong stereotype of what a good EA looks like, maybe there are good potential EAs who don’t fit that stereotype, who we’re missing out on as a result of implicit bias.

Be aware of selecting against people without a social safety net

EA seems extremely non-diverse to me in terms of class, race, and gender. The current advice probably selects against certain groups, like poorer people who don’t have a social safety net.

It’s all well and good to say “don’t give locally” if your family are all well taken care of, and you don’t know anyone in dire hardship. It becomes a much more demanding requirement if you have a family who require support from you. In those cases I don’t think many people would be comfortable saying, “yeah, just don’t worry about supporting your family”.

So in and of itself I think that selects for people who are sufficiently well-off such that they and the people they care most about are at least taken care of. They have an income they can give to effective charities, or they can afford to spend time thinking about where the best place to give is etc.

Similarly, being risk-tolerant with your career is a privileged thing to be able to do. Even having the option to go to university selects against some disadvantaged groups.

And there’s probably a snowball effect there, where once you have a sufficiently non-diverse group it’s hard to make it more diverse. It can just become a place where some people don’t feel like they belong, because there aren’t other people like them in the group, and that’s kind of unfortunate.

There’s a question of “is this kind of diversity desirable?” And I think yes, because if you don’t have it, it really suggests that something is wrong. For example, if you’re consistently not seeing women in your community, it’s reasonable to think, “well, there are a lot of women who would agree with these ideas, so why is there such a gender imbalance?”

Lacking diversity is evidence of something bad

I take a pretty strong stance that lacking diversity is evidence of something bad. If you don’t buy that, you have to think the arguments and ideas of effective altruism somehow aren’t compelling to women or racial minorities or other groups that aren’t well represented in effective altruism. If you don’t think the arguments should only be compelling to one gender or race of people, you should stop and ask “why don’t I see the proportion of people I’d expect to see given the nature of the ideas?”

When I find myself to be the only person of my group in the room, I want to leave

For me, diversity is important for several reasons.

First, so that people feel included and can add their own talents. There is an old economics paper which suggests that one of the main historical causes of growth rates is simply population, i.e. that larger countries grew faster while isolated island populations could draw on fewer ideas and were less productive. Without visibly including minority groups, EA will miss out on talent.

Second, diversity is important so that majority-group EAs can promulgate their ideas more successfully. When I see a paper with 10 authors, or even a post on the EA forum or a blog that thanks 10 people, and all of them appear to be male, I won’t read the article. It is a quality signal, and I don’t have enough time to read everything.

It is similar to poor typesetting: it may be a perfectly fine article, with sensible and important points, but if it has sloppy punctuation, bizarre line breaks, or is in a hard-to-read font, I feel justified in not reading it. Not including women is analogous to sloppy punctuation because it signals low effort on the part of the authors, who could with a little extra effort have included some. It is analogous to bizarre line breaks in that it signals being in some isolated bubble, and work done in an isolated bubble tends to be worse in quality.

It is analogous to being in a hard-to-read font because I do not want to put in the extra effort to suffer through reading something that starts off by pissing me off. Good writers — or speakers, or whatever — make the reader’s job easy. Most people will spell check their articles despite the extra effort it might take, because it’s worth it to attract readers and avoid embarrassment. The same principle applies within the EA community.

Finally, I fear that these cycles are self-perpetuating. When I find myself to be the only person of my group in the room, I want to leave. I feel put on the spot, bearing the weight of “representing” my group. Even if no one is thinking it, I’m thinking they are thinking of me as the “minority group” person, and it frankly makes my contributions worse due to that extra stress. I’ve come very, very close to leaving the room on several occasions, and if EA keeps going in this direction I’ll have to find a substitute.

I think it also colours others’ perceptions of me and of the arguments that I make. Sometimes, I find myself talking with a white man, and he’ll think I am wrong about something with such force that, startled, I momentarily conclude I must have been incorrect. Thinking on it later, I will realise I was indeed correct and be very frustrated. This can in principle happen with anyone of any race/gender/class/etc., but in practice it’s only happened with well-off white men. My impression is that it is based on my minority group status, and I have not been treated so badly in any community except the EA community, even in my deeply imbalanced line of work.

I have literally been asked “what are you doing here?” at EA events by people who don’t know me and assume I’m lost or something. Heck, I’ve been asked this when I was a speaker at the event. The biases can be ridiculous, and the “lesser” ones support the more egregious ones.

Diversity is really important, but maybe for different reasons than you hear

I actually do think diversity is really important. That doesn’t mean I agree with all the reasons people give that it’s important, or that I agree with others’ claims about the magnitude and nature of the benefits.

I think a lot of people make bad implicit arguments about why diversity is good. For example, there’s often this implication that if what you want is lots of perspectives to diversify your thinking, and point out flaws in your thinking, then the best way to get that is to pursue certain kinds of very visible demographic diversity. And I don’t think that holds. I don’t think that makes sense.

And I think a lot of people in the effective altruism community kind of overreact to that — they hear that, and then they just get so suspicious of this whole topic that they’re not listening to the parts of the argument that are actually good.

A lot of social justice people are just incredibly strident, and make a lot of bad arguments — and so it’s hard to listen to them. But I think they have a lot of good points too.

I do think there are genuinely good reasons to have diversity.

One is: if you walk into a room, and you see that everyone is just visibly, noticeably different from you in a particular way, I think you’ll have an intuitive gut-level sense that you don’t belong. And I think that’s important. It’s important to EA, and I think a lot of us who imagine you should just be able to get over it are people who haven’t had to deal with that. If you’re a man, it’s likely you don’t have experience with female-dominated workplaces. So I think we tend to understate that.

And then there’s a loop, because that is a real thing — and then because it’s a real thing, I think people who see a crowd that has no visible demographic diversity in it, they kind of reasonably infer that whoever’s organising this whole thing just didn’t care. And then that makes them feel excluded in a different way.

So I think a lot of white men feel excluded in a group of white men, because they can tell that no one there was, like them, feeling a little weird about a group that was all white men. And so I think there’s an intuitive sense of exclusion that goes through two levels like that.

And then I do think that a lot of cultures, and communities, and companies naturally just have blind spots, and get confused between ‘what’s the right rational way to live’ and ‘what am I like?’, or ‘what are people similar to me like?’. They get confused between ‘who’s good’ and ‘who’s similar to me?’

In a perfect world, there would be some very well targeted way to fight that. But often, forcing yourself to improve your diversity on these very measurable things, race and gender, is one of the best practical ways I know of to make yourself confront a lot of your biases.

Let’s say that you have a culture where people are very aggressive and confrontational all the time, and it’s a big time “ask” rather than “guess” culture. You might get used to kind of assuming that this is just the right, good, natural way to be. And that might make your culture less meritocratic, because there are definitely people who have a ton to offer but shut down in that kind of setting. And you might never notice this because you just notice that in your culture, people who have these qualities do well and people who don’t don’t. Well, now say that you force yourself to really prioritise demographic diversity. Now you’re more likely to be forced to deal with people who don’t have all the qualities you’re used to thinking of as natural and good.

And in theory, you could cook up other ways to notice and confront this kind of problem, but in practice I think this is a pretty good candidate for what ends up working. Sometimes when you just try to reach outside your bubble, in the most recognisable, measurable, visible way that you can — what you’ve done is you’ve reached outside of your bubble. And now that you’ve reached outside of your bubble you’re actually learning lessons about how to create an environment that is meritocratic. Instead of pretending to be meritocratic, but actually just biased towards people who are like you in every way.

I think these are three legitimate reasons to care about diversity, and I think you have a problem if your whole community, or your whole culture, or your whole company, is made up exclusively of white men, or even leans heavily that way. And I think people should care about that.

That’s in addition to the fourth issue, which is PR and brand and reputation — how people perceive you. But if it was just that, I wouldn’t care as much.

There are a lot of people who overplay the whole thing, who make a lot of bad arguments, who kind of demand that every community be exactly proportionate to American demographic percentages, which doesn’t make any sense — that’s not a reasonable goal. But I think there are good people who are turned off by non-diverse communities, because they recognise at some level these legitimate things I’ve talked about, and probably other benefits to diversity I’m not thinking of or articulating.

We shouldn’t discourage rich people from giving to less privileged people

I think it sucks that there’s not more diversity in effective altruism, but it also sucks when people think this means that those effective altruists that are pretty privileged are somehow doing a bad thing when they give to effective causes. For example, the meme of “the white saviour” is supposed to apply to a privileged white person who doesn’t actually care about the cause, or doesn’t know anything about it. But if we start to refer to any rich white person who gives money to less privileged people — even if they do genuinely care about the cause — as a “white saviour”, we will end up thinking that all that rich people should do is just give to the opera or something. If people are privileged, I want them to give away their resources to people that need them more. I’m not going to discourage them from doing that.

I think there were genuinely good arguments behind some of these images that people had, like the white saviour or the slum tourist; it’s just that people aren’t being precise enough about what they actually disagree with. You can’t take the white saviour idea and then say “white people should not try to help anyone.” That just seems like a really bad conclusion.

You can say, here’s what it’s pointing to that is wrong — thinking that you know more about what’s good for someone than they do, or not attempting to gather information, not genuinely caring, just trying to make yourself feel better etc. I think it’s good to point out these things, but that is very different than “white people shouldn’t help anyone”.

If you think there are historical injustices, then people giving a bunch of their time and money seems like one way of correcting that. It’s not necessarily that they’re doing something awesome — maybe they’re just doing something that they’re obligated to do from this privileged position — but then surely it’s still good for them to meet that obligation.

The burden for increasing diversity should fall on people in proportion to their current representation within the group

I think it’s frustrating when the burden for increasing diversity falls on the few members of the group who happen to be women, or in racial minorities. Either you think that having more people from these groups is important, and that it’s a bad sign if you don’t, or you don’t think that.

And basically I think that if you do think this is important, the burden should fall on people in proportion to their current representation within the group. It’s important to seek out experiences from people that might be an indicator of things that are going wrong, but that’s very different from asking women and minorities to do most of the work to increase the number of women and minorities in the group.

This happens a lot, and is extremely annoying for people in those minority groups.

I think the reason why people are so cautious about this is that they think you can’t understand what it’s like unless you’re in this position. And I think, even if that’s true, you can go and find information about what it’s like. Having a survey, or an open call for experiences people have had and where they think we could improve, is a much lower burden than, say, needing someone from a minority background to go to a committee meeting every week, or to be the officer for diversity. Those are huge burdens.

People should be rewarded for coming up with their own worldviews

I really wish there was more epistemic diversity. There’s a stereotype of EA, which is: everyone comes from an analytic philosophy or STEM background; everyone reads the same four blogs; if they’re reading outside of EA, let’s say about the causes of the industrial revolution, then they’ll read “A Farewell to Alms” by Gregory Clark, because that’s like the “EA approved” book on the topic.

And you end up in a quite closed epistemic circle. And I think that’s particularly bad for people who are in actual jobs where you’re extremely limited in terms of how much information you can consume. Because EA already produces far too much information for one person to consume, and so you can end up relying solely on that.

So I’d like there to be more historians, I’d like there to be more genuine economists, I’d definitely like there to be people with a hardcore policy/politics background. I just really like it when people try to figure out their own worldview, and that maybe takes years of thought — but then they come up with it, and share it, and I just find that enormously valuable.

In terms of solutions on the epistemic diversity front, one is lessening peer-updating norms to some extent. Peer updating is still very important when it comes to action, because otherwise you end up with unilateralists.

But a person who puts in the work to come up with their own worldviews can be viewed as “that person with the crazy views”, and kind of ostracised, rather than being rewarded for trying to figure things out for themselves. And I’d definitely like to see a shift there.

There’s a lack of political and age diversity

There’s a lack of diversity. The vast majority of EAs are left of centre: the 2017 EA survey had 309 respondents identifying as Left, 373 as Centre Left, 31 as Centre Right, and 4 as Right. That’s roughly a 19 to 1 left-to-right ratio, and if we weight Left and Right responses at 2 and the centre positions at 1, it rises to roughly 25 to 1. If the surveys are comparable, then even using the more moderate 19 to 1 ratio, EA is more liberal than all but the most liberal types of US college professors (English literature and political science). EA is more liberal than music, performing art, fine art, theology, philosophy, and sociology professors!
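
The ratios quoted above are straightforward to check. Here is a minimal sketch in Python (our illustration, not part of the original answer) that reproduces the arithmetic from the survey figures:

    # 2017 EA survey responses on political identity, as cited above.
    left, centre_left = 309, 373
    right, centre_right = 4, 31

    # Unweighted ratio: everyone left of centre vs. everyone right of centre.
    unweighted = (left + centre_left) / (right + centre_right)

    # Weighted ratio: count the stronger positions (Left, Right) twice.
    weighted = (2 * left + centre_left) / (2 * right + centre_right)

    print(f"unweighted: {unweighted:.1f} to 1")  # unweighted: 19.5 to 1
    print(f"weighted:   {weighted:.1f} to 1")    # weighted:   25.4 to 1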

Because of the dominance of the left in EA, the areas of diversity they are most focused on addressing are the left-of-centre issues of gender, race, sexual orientation, etc. But I think lack of political diversity is a big issue as well, and achieving equal representation there would do more for EA’s growth potential than the kinds of diversity the left emphasises. This also matters for making sure we are not missing causes that conservatives may be more alert to. For instance, if it were true that fetuses should be valued similarly to born people, abortion would be the biggest source of mortality in the US, and perhaps an effective cause globally for the present generation.

Someone told me they found it problematic that conversations advocating for open borders immigration are turning conservatives away from EA. We shouldn’t censor conversations like this, but the effect is worth noting.

How many people who work at EA orgs are openly politically conservative? How many EA org leaders are openly politically conservative?

We could be not thinking about important things because of that.

And lack of age diversity is probably the biggest one in terms of how much bigger EA could become — and the influence of older people is generally higher. I think there are lots of things we can do to become more inclusive of older people, like not using young people’s acronyms (like PM, IMO, etc.), or at least defining them upon first use on a page, and not assuming that everyone has a smartphone or uses social media. I think we should also try harder to recruit older people, which might involve more mass media (or going to professional meetups), rather than internet outreach.

One reason we might not be succeeding at recruiting more older people is that they’re cautious of being associated with EA in case there is some major scandal that then hurts their reputation.

Religion is a tougher one because it affects people’s values in a way that may be in tension with EA. Generally people think you can’t be whole-person rational and religious. However, not all EAs are whole-person rational; it is possible to be rational in a compartmentalised way.

Diversifying EA outside of cities I think could bring some valuable perspectives too.

There’s a lot of room for different causes

On epistemic diversity, I don’t think that there’s this extremely tight set of premises that bind people in EA, so there’s a lot of room for different cause areas. Some people are really interested in global development, others in animal welfare or the far future. Some people are religious, others are non-religious — so the background assumptions can be quite different.

A lot of different views converge on something like a minimal version of EA, in the sense of just “do the most good you can do”. I don’t think that requires utilitarian ethics, for example. And so you should expect some level of epistemic diversity there, and it seems like a good thing to maintain and foster.

But not having epistemic diversity also doesn’t seem that bad if you’re forming a group based on epistemic beliefs — if you’re literally selecting for beliefs. If those beliefs are also highly correlated with some other beliefs and it’s just the case that some of the key premises aren’t shared by people who are, for example, more socially conservative, it’s not an obvious problem to me. It’s good to get your ideas tested, but you can get that from engaging well with outside groups rather than refusing to take a position in order to preserve epistemic diversity.

Something that would be bad even in this case is if there’s a way in which we’re excluding people. If you think it’s a good idea to have a community of effective altruists, and you want that community to be larger, then if you’re excluding people who agree with the ideas for some other reasons, that seems pretty bad and unnecessary.

Anonymous contributors answer: What are the biggest flaws of the effective altruism community? https://80000hours.org/2020/03/anonymous-answers-flaws-effective-altruism-community/ Mon, 02 Mar 2020 21:25:35 +0000

I’m extremely pro peer-updating in general, but from the perspective of the community as a whole — I’d much rather a lot of people having a lot of personally formed views.

Anonymous

The following are excerpts from interviews with people whose work we respect and whose answers we offered to publish without attribution. This means that these quotes don’t represent the views of 80,000 Hours, and indeed in some cases, individual pieces of advice explicitly contradict our own. Nonetheless, we think it’s valuable to showcase the range of views on difficult topics where reasonable people might disagree.

This entry is most likely to be of interest to people who are already aware of or involved with the effective altruism (EA) community.

But it’s the thirteenth in this series of posts with anonymous answers — many of which are likely to be useful to everyone. You can find the complete collection here.

We’ve also released an audio version of some highlights of the series, which you can listen to here, or on the 80,000 Hours Podcast feed.

Did you just land on our site for the first time? After this you might like to read about 80,000 Hours’ key ideas.

In April 2019 we posted some anonymous career advice from someone who wasn’t able to go on the record with their opinions. It was well received, so we thought we’d try a second round, this time interviewing a larger number of people we think have had impressive careers so far.

It seems like a lot of successful people have interesting thoughts that they’d rather not share with their names attached, on sensitive and mundane topics alike, and for a variety of reasons. For example, they might be reluctant to share personal opinions if some readers would interpret them as “officially” representing their organizations.

As a result we think it’s valuable to provide a platform for people to share their ideas without attribution.

The other main goal is to showcase a diversity of opinions on these topics. This collection includes advice that members of the 80,000 Hours team disagree with (sometimes very strongly). But we think our readers need to keep in mind that reasonable people can disagree on many of these difficult questions.

We chose these interviewees because we admire their work. Many (but not all) share our views on the importance of the long-term future, and some work on problems we think are particularly important.

This advice was given during spoken interviews, usually without preparation, and transcribed by us. We have sometimes altered the tone or specific word choice of the original answers, and then checked that with the original speaker.

As always, we don’t think you should ever put much weight on any single piece of advice. The views of 80,000 Hours, and of our interviewees, will often turn out to be mistaken.

What are the biggest flaws of the effective altruism community?

Groupthink

Groupthink seems like a problem to me. I’ve noticed that if one really respected member of the community changes their mind on something, a lot of other people quickly do too. And there is some merit to that, if you think someone is really smart and shares your values — it does make sense to update somewhat. But I see it happening a lot more than it probably should.


Something I feel is radically undersupplied at the moment is just people who are really trying to figure stuff out — which takes years. The person I’m mainly thinking about as the kind of paragon of this is Carl Shulman, who’s spent years and years really working out for himself all the most important arguments related to having a positive influence in the long run, and moral philosophy, and meta-ethics, and anthropics and, well — basically everything. And the number of people doing that is very small at the moment. Because there’s not really a path for it.

If you go into academia, then you write papers. But that’s just one narrow piece of the puzzle. It’s a similar case in most research organisations.

Whereas just trying to understand basically everything, and how it all fits together — and not really deferring to others, and actually trying to work out everything yourself, is so valuable. And I feel like very few people are trying to do that. Maybe Carl counts, Paul Christiano counts, Brian Tomasik counts, I think Eric Drexler as well.

If you’re someone who’s considering research in general, I think there’s enormous value here, because there are just so few people doing it.

I think there are plenty of people who are intellectually capable of this, but it does require a certain personality. If we were in a culture where having your own worldview — even if it didn’t seem that plausible — was an activity that was really valued, and really praised, then a lot more people could be doing this.

Whereas I think the culture can be more like “well, there’s a very narrow band of super-geniuses who are allowed to do that. And if you do it, you’re going to be punished for not believing the median views of the community.”

I’m extremely pro peer-updating in general, but from the perspective of the community as a whole — I’d much rather a lot of people having a lot of personally formed views. I feel like I learn a lot more from reading opinions on a subject from ten people who each have different, strong, honest views that they’ve figured out themselves, rather than ten people who are trying to peer-update on each other all the time.


Everyone’s trying to work at effective altruism (EA) orgs.

Too many people think that there’s some group of people who have thought things through really carefully — and then go with those views. As opposed to acknowledging that things are often chaotic and unpredictable, and that while there might be some wisdom in these views, it’s probably only a little bit.

Disagreeableness

I’m concerned that some of the social norms of EA are turning people off who would otherwise find the ideas compelling. There’s such a norm of disagreeableness in EA that it can seem like every conversation is a semi-dispute between smart people. I think it’s not clear to a lot of people who have been around EA for a long time just how unusual that norm is. For people new to EA, it can be pretty off-putting to see people fighting about small details. I don’t think this problem is obvious to everyone, but it seems concerning.

Too much focus on ‘the community’

Sometimes it isn’t that fun to be around the EA community.

I’d much prefer an emphasis on specific intellectual projects rather than a community. It sometimes feels like you’re held to this vague jurisdiction of the EA community — are you upholding the norms? Are you going to be subject to someone’s decision about whether this is appropriate for the community? It can seem like you’re assumed to have opted in to something you didn’t opt in to, something that has unclear norms and rules that maybe don’t represent your values.


I think sometimes people are too focused on what the community does, thinks, etc. What you’re doing shouldn’t depend too much on what other people are doing unless you personally agree with it. If the effective altruism community ended tomorrow, it honestly wouldn’t affect what I’m doing with my life; I do what I do because I think the arguments for it are good, and not because the effective altruism community thinks it’s good.

So I think the ideas would survive the non-existence of the community. And I think we should generally focus on the ideas independently (though if you really value the community, I understand why that might be important).

A ‘holier than thou’ attitude

Something that seems kinda bad is people having a ‘holier than thou’ attitude: thinking that they’ve worked out what’s important, and most other people haven’t.

But the important part of EA is less the answers we’ve arrived at, and more the virtues in thinking that we’ve cultivated. If you want other people to pick up on your virtues, being a jerk isn’t the best way to do it.

Failing to give more people a vision of how they can contribute

I don’t think EA ever settled the question of “how big a mass movement does it want to be?” We raised a lot of good points on both sides, and then just ambivalently proceeded.

If we want to be a mass movement, we’re really failing to give average people, and even some well-above average people, a vision of how they can contribute.

A lot of people get convinced of the arguments for longtermism, and then encounter the fact that there aren’t really good places to donate for far-future stuff — and donating money is the most accessible way to contribute for a lot of people.

I worry that this creates a fairly large pool of money that may actually end up being spent on net-negative projects, because it’s just floating around looking for somebody to take it. That creates conditions for frauds, or at the very least for people whose projects aren’t well thought through — and maybe the reasons they haven’t received funding through official sources yet are good ones.

But there are a lot of people who want to help, and who haven’t been given any good opportunities. If we want to be a mass movement, I think we’re really failing by being too elitist and too hostile towards regular people.

We’re also not giving good, clear ways to donate to improving the far future. I think that even if you’re convinced by the arguments for longtermism, unless you have a really good reason to think that a particular giving opportunity is going to be underrated by the institutions that are meant to be evaluating these things, you should consider donating to animal welfare or global development charities, both of which are very important.

The arguments for why those causes are important are not undermined by the possibility of short AI timelines. If anything, saving someone’s life is a bigger deal if it means they make it to the singularity. It’s fine to say, “yep, I’m persuaded by these long-term future arguments, but I don’t actually see a way for my money to make a difference there right now, so I’m going to make donations to other areas where it’s clearer that my donation will have a positive effect.”

The community should be more willing to say this. I don’t think I’m the only person convinced by longtermism arguments who doesn’t think that a lot of people should donate to longtermist stuff, because there just aren’t that many good giving opportunities. People can be unwilling to say that, because “we don’t want your money” can sound snobby etc.


Deemphasizing growth. One way of countering lock-in in the media is to have new stories covering additional facets of EA. I think there are a lot of problems that it would be great to have more EAs working on and donating to. EAs have expressed concern that recruiting more people would dilute the movement in terms of ability. But I think that it is okay to have different levels of ability in EA. You generally need to be near the top to be at an EA organisation or contributing to the EA forum. But if someone wants to donate 10% of their money to a charity recommended by EA, and not engage further, I think that’s definitely beneficial.


I’d like to see a part of EA devoted to a GiveWell-type ranking of charities working on the reduction of global catastrophic risks.

Longtermism has become a status symbol

Believing the arguments for longtermism has become something of a status thing. A lot of EAs will tend to think less of people if they either haven’t engaged with those arguments, or haven’t been convinced. I think that’s a mistake — you have to create conditions where people don’t lose respect for disagreeing, or your community will predictably be wrong about most things.

Not engaging enough with the outside world

I worry about there being an EA bubble — I’d like to see more engagement with the outside world. There are some people who aren’t ever going to be convinced by your view of the most important things, and it’s fine to not worry about them.

At the same time, there’s a risk of people getting carried away talking with others who really agree with them, and then trying to transfer that to the rest of their careers. They might say things at work that are too weird, or they might make overly risky career decisions that leave them without backup options.

Not following best hiring practices

There are some incompetent people in prominent positions at EA organisations — because the orgs haven’t put enough time into studying how to best find successful employees.

EA orgs should study best hiring practices. If a role is important, you need to get the right person — and that shouldn’t be on the basis of a cover letter, a resume and an interview. Everybody involved in hiring should read Work Rules!, and people should be implementing those principles.

Being too unwilling to encourage high standards

I think it does make sense to have messages for highly involved EAs to make sure they don’t burn out. However, this should probably happen more in person than online, as these people are typically part of in-person EA communities anyway. The large majority of EAs are not giving 10% of their money, changing their careers radically, or working themselves to the bone, so I think they should be encouraged to meet high standards. I think we can keep our standards high, such that you donate 10% of your money, or do direct effective work, or volunteer 10% of your free time (roughly 4 hours a week) to EA organisations, or maybe just promote EA individually. I think EA can still grow much faster even with these high standards.


I don’t know if we should have the norm that donating ends when retirement starts. But maybe it was an appropriate compromise, to avoid making the commitment too intimidating.

Doing non-technical research that isn’t actually useful

I’m sceptical of most forms of non-technical EA-ish research being practically useful.

I think there are a few people who do excellent macrostrategy research, like Nick Bostrom — but there’s a norm in the EA community of valuing it when someone comes up with a cool new consideration or an abstract model that relates to an EA topic, and I think most of that work isn’t actually valuable. It’s the sort of thing where, if you’re not exceptionally talented, it’s really difficult to do valuable work.


There can be a temptation among EAs to think that just writing up considerations on interesting topics is the most useful thing they could be doing. But I often see write-ups that are overly general and not empirically grounded enough; only a few people are going to read them, and none of those readers are likely to update their views as a result.

People can feel like if they write something and put it up on the internet that equals impact — but that’s only true if the right people read it, and it causes them to change their minds.

Abandoning projects too quickly

Often people don’t commit enough time to a project. Projects can be abandoned after 6 months when they should have probably been given years to develop.

Most people live in the centre of big cities

I think it’s a problem that the important organisations and individuals are mostly in EA hubs. This is especially problematic because all the EA hubs are in NATO cities, which likely would not survive a full-scale nuclear war. A simple step to mitigate this problem is living in the suburbs or even beyond them, but I think EAs have a bias towards city life. (There is already a gradient in rent representing commuting costs, so if you actually think there is a significant chance of nuclear war, it makes sense to live outside of metros, especially if you can multitask while commuting.) Even better would be locating outside NATO countries, in ones such as Australia or New Zealand (which also have lower pandemic risk).

Lack of support for entrepreneurs

I’d love to see someone create a good EA startup incubator. I don’t think anyone’s doing it well at the moment.

One of the biggest problems with EA is a lack of entrepreneurs who are ready to start a project on their own. But if we could get some of the best EAs to commit to systematically allocating some of their time to helping people with the best proposals get their new projects or orgs ready to go, I think that would be the most effective way to utilise the resources we currently have at our disposal.

Valuing exceptional work in a non-effective job too highly

Many EAs have said that if you’re building career capital in a non-effective job, you have to be an exemplary performer in that job. But I think that takes so much effort that you aren’t able to develop background knowledge and expertise for your actual effective work. One example is working hard for bonuses: in my experience, the marginal dollars per hour from bonuses are very low.

Too cautious

Maybe slightly too cautious overall. I understand the reasons for focusing on possible negative consequences, but I think generally I’m more pro “doing things”.

Too narrow

Thinking about the way that you are putting things, and the tone that they have, is very important. But it’s one of those things where people can fail to acknowledge the importance of it.

People who disagree with an idea find it very hard to say “I disagree with this, but I don’t quite know why”. It’s also very hard to say, “the thing is, I don’t really disagree with any of the claims you made, but I really do disagree with the way they were made, or what they seem to imply”.

I suspect when it comes to a lot of the criticisms of EA, people will try to present them as disagreements with the core ideas. And I think a lot of the people making these critiques don’t actually disagree with the core ideas, they’re really saying “it feels like you’re ignoring a bunch of things that feel important to me”.

So I would like to see EA grow, and be sensitive to those things. And maybe that means I want EA to be broader, I think I probably do. I would like there to be more people who disagree. I would like there to be more people who won’t present things in that way. It would be nice to see more moral views presented; I think these ideas are not restricted to the groups that are currently dominantly represented in EA. And so I think an epistemically virtuous version of EA probably is broader, in terms of actually gathering, and being compelling to people with a range of different views.


I think there is a bias in the existential risk community towards work at the global top 20 universities. Something like 90 percent of the work gets funded there, whereas for research in general it might be a couple of percent. You could argue that for some problems you really need the smartest people in the world. But I think that lots of progress can be made by people who aren’t at those elite universities. And it is a lot cheaper at other universities.

Neglecting less popular funding opportunities

I think one mistake is Good Ventures not diversifying their investments (last time I checked, I think nearly all was still in Facebook).


There are still funding gaps that aren’t necessarily always recognised. There’s talk about earning-to-give being deprioritised, but that only makes sense for higher-profile EA cause areas. For areas that aren’t popular at all in the mainstream world — EA funding is essential. There are a lot of exciting projects that just don’t get done purely because of funding gaps.


I think Open Philanthropy putting $55 million into something [CSET] that is not even focused on transformative AI, let alone AGI, was not a good idea, considering all the other GCR reduction opportunities there are.


There are really large funding gaps, both for existing EA-aligned organisations and for ones yet to be funded. And when a group does get funded, that doesn’t mean it was able to get full funding. It can also be challenging to learn about all the different EA organisations, as there’s no central hub; lists are very scattered, so it’s hard for the community to learn about them all and what their needs are.

A lack of focus on broader global catastrophic risks

I think a common mistake long-term future EAs make is assuming that existential risk means only extinction. In reality, there are many routes to far-future impact that do not involve extinction right away.


I’ve heard a number of long-term future EAs express skepticism that any GCR interventions could actually be net beneficial to the present generation. However, the book Catastrophe: Risk and Response made just this argument. Also, there are models showing that both AGI safety and preparation for agricultural catastrophes are highly cost effective for the long-term future and for the present generation.

Being too siloed

I think EA is a little too siloed. It is useful to take into account a particular intervention’s impacts across multiple cause areas, like GCR interventions also saving lives in the present generation.

I think it is great that EAs are proposing a lot of possible Cause Xs, but I would like to see more Guesstimate cost-effectiveness models to be able to evaluate them.

Not media savvy enough

EAs should try to be more media savvy. This applies to avoiding misconceptions around topics like earning to give.

But EAs should also recognise the importance of telling a good story. For longtermism, this is particularly hard. Showing a video of a starving child tugs on the heartstrings, but how do you do that for future generations? How do you do that for AI safety? I think EAs could spend more time thinking about how to communicate this stuff so that it resonates.

Also, focus on the positives: that everyone can be a hero. If you focus on guilt, people switch off.

When I tell people that we’re trying to avoid catastrophic risk, they always think I’m talking about climate change.

How can EA better communicate that climate change isn’t the only big risk?

Anonymous contributors answer: What are the biggest flaws of 80,000 Hours? https://80000hours.org/2020/02/anonymous-answers-flaws-80000hours/ Fri, 21 Feb 2020 18:04:57 +0000

I think we’re currently too far in the direction of “there’s like 10 legitimate career paths for people involved in effective altruism, and no others”. That’s crazy; there are dozens and dozens of career paths where you could make a difference. A huge difference — you might make the difference.

Anonymous

The following are excerpts from interviews with people whose work we respect and whose answers we offered to publish without attribution. This means that these quotes don’t represent the views of 80,000 Hours, and indeed in some cases, individual pieces of advice explicitly contradict our own. Nonetheless, we think it’s valuable to showcase the range of views on difficult topics where reasonable people might disagree.

The advice is particularly targeted at people whose approach to doing good aligns with the values of the effective altruism (EA) community, but we expect most of it is more broadly useful.

This is the twelfth in this series of posts with anonymous answers. You can find the complete collection here.

We’ve also released an audio version of some highlights of the series, which you can listen to here, or on the 80,000 Hours Podcast feed.

Did you just land on our site for the first time? After this you might like to read about 80,000 Hours’ key ideas.

In April 2019 we posted some anonymous career advice from someone who wasn’t able to go on the record with their opinions. It was well received, so we thought we’d try a second round, this time interviewing a larger number of people we think have had impressive careers so far.

It seems like a lot of successful people have interesting thoughts that they’d rather not share with their names attached, on sensitive and mundane topics alike, and for a variety of reasons. For example, they might be reluctant to share personal opinions if some readers would interpret them as “officially” representing their organizations.

As a result we think it’s valuable to provide a platform for people to share their ideas without attribution.

The other main goal is to showcase a diversity of opinions on these topics. This collection includes advice that members of the 80,000 Hours team disagree with (sometimes very strongly). But we think our readers need to keep in mind that reasonable people can disagree on many of these difficult questions.

We chose these interviewees because we admire their work. Many (but not all) share our views on the importance of the long-term future, and some work on problems we think are particularly important.

This advice was given during spoken interviews, usually without preparation, and transcribed by us. We have sometimes altered the tone or specific word choice of the original answers, and then checked that with the original speaker.

As always, we don’t think you should ever put much weight on any single piece of advice. The views of 80,000 Hours, and of our interviewees, will often turn out to be mistaken.

What are the biggest flaws of 80,000 Hours?

People need to think through these questions for themselves

I think 80,000 Hours generally give pretty good career advice, but I’m worried people take it as gospel — because 80,000 Hours have kind of cornered the market on giving career advice to people in the effective altruism (EA) community. But it’s important to make it clear that these questions are really hard, and smart people disagree about the answers, and you actually need to think it through for yourself — though you still might get it wrong.

A lot of people feel like 80,000 Hours is not talking to them

I know a lot of fairly competent people who donate a lot of money, participate in online discussions, and overall are part of the ecosystem that allows us to develop good ideas. They feel like 80,000 Hours is not talking to them at all, and has no advice that is applicable to them — they have this perception that 80,000 Hours is for people who are vastly more impressive than they are. This can make them sad, and it can drive them away from EA a little bit. Certainly, if they don’t have a strong EA network, they can bounce away from EA entirely. If it’s the case that we should be a mass movement, this seems like a problem.

I’d be excited about an EA movement that encouraged lots of people who weren’t able to contribute directly right now to get a regular job, and that provided those people with a lot of thought-provoking resources they could engage with during their spare time. Some of them can then blossom into people who can do direct work.

It’d also be great to have the perception that EA really valued the people who were participating in building the ecosystem of the community, while working their regular job as a flight attendant or police officer etc. Jobs like that allow you to save enough money to save a few lives per year, and leave you with a lot of mental energy at the end of the day to pursue your intellectual interests.

If 80,000 Hours stopped being public facing almost entirely, and just focused on reaching out to elite people, connecting them with the best jobs, that would be valid. But I think there’s a problem if the content is public, but a lot of people who read it come away thinking that there isn’t any way they could possibly contribute anything of value to this movement.


I am very concerned about intelligent people being turned off by 80,000 Hours’ elitism. I have argued that there is a lot of good content that 80,000 Hours has developed that would be useful to the typical college graduate. I understand the focus on maximizing impact with especially high-impact career changes, but if EA gets a reputation of being elitist, that could compromise future growth potential. So I don’t think it is a good idea to rewrite the career guide for an even more advanced audience. It seems like there should be some non-offensive way of routing different ability (or dedication?) levels to different parts of the website.

I am also concerned about deemphasizing earning to give, because I think there are lots of fledgling organizations that would benefit from more money. Also, earning to give is the most accessible and scalable effective career, because people can do it without even switching careers. Though it is true that current giving in EA is dominated by billionaires, that will not necessarily continue to be the case. If people in developed countries were chosen randomly and all gave 10%, it would not actually be dominated by billionaires.

I think 80,000 Hours is assuming job lengths that are too short. They interpreted data showing that younger people have been in their current job for an average of four years as meaning that people stay in their jobs for four years on average. But those are different things: if everyone stayed in their job for 40 years and you polled them all at a random moment, the average person would say they had been in their job for 20 years. Observed tenure so far is roughly half the full job length, so I think the average time that young people stay in a job is about eight years.
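
To see why the observed figure understates full job length, here is a toy simulation in Python (our illustration, assuming for simplicity that every job lasts exactly eight years and that workers are polled at a uniformly random moment during the job):

    import random

    # Assume, for illustration, that every job lasts exactly 8 years.
    JOB_LENGTH = 8.0

    # Poll 100,000 workers at a uniformly random moment within their job.
    elapsed = [random.uniform(0, JOB_LENGTH) for _ in range(100_000)]

    # Average elapsed tenure is about half the full job length.
    print(sum(elapsed) / len(elapsed))  # ~4.0 years

Under these assumptions, the poll recovers the four-year average tenure the survey reported, even though every job actually lasts eight years.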

Too focused on a few highly legible career paths

Although you’re moving away from this — making it seem like “a career” is a natural unit. Listing careers and then ranking them — I think that’s pretty inaccurate and misleading. The interesting stuff is in the grey space between the careers.

I’d like to see more stuff like the Ben Kuhn post on the EA forum: “Career choice: Evaluate opportunities, not just fields”. Throughout your life you’ll encounter a lot of different specific things that you can do, and the details of those specific things could matter just as much as what cause area or career they fall into. Spend more time thinking about the specific opportunities that may be available to you, rather than the abstract divisions of areas.


Overly focused on a few highly legible career paths.

If I were in 80,000 Hours’ shoes, I would start from causes, not career paths. I wouldn’t even refer to “AI safety”; I’d say something like “the project of causing AI to be built safely and then deployed well” — and then ask, “okay, what’s going to be needed for that cause?”

It might need tons of different things. You obviously need researchers, but you’ll also need managers, you’ll need people who can communicate these ideas, you’ll need people to convince other people, you might need people in government, in information security, you’ll maybe need to build alliances with other countries, you’ll maybe need an intimate knowledge of industry — there are just so many things you’re fairly likely to need.

And so I would prefer to encourage people to develop skills and career capital in relevant areas where they might be great. And that can include something very general, like being a strong writer, speaker, or manager.

I don’t want to take this too far: I think if you can plausibly do AI safety technical research well, you should do it. But I think we’re currently too far in the direction of “there are like 10 legitimate career paths for EAs and no others”. That’s crazy; there are dozens and dozens of career paths where you could make a difference. A huge difference — you might make the difference.

It’s less worth it for 80,000 Hours to invest a lot into researching a career path that only three people can go into effectively — but that means the kinds of paths where your input would be extremely important can get overlooked. The absolute best career for you might never make a top careers list.

Should be more ambitious in trying to reach the next generation of influencers

I think a strong case could be made for massively expanding 80,000 Hours.

I think the multiplying effect of putting a lot of money into educating the most talented young people could be enormous.

80,000 Hours with a large marketing budget — hiring people just to organise the most influential US campuses. You could have 10 full-time staff, each in charge of organising 10 campuses — writing articles for the school newspaper, handing out leaflets, giving presentations etc. Then have volunteers who want to dedicate 10-15 hours a week. Have a clear programme for what they’re doing. And then the entire next generation of influencers — politicians, philanthropists etc — could understand EA principles and 80,000 Hours specifically. If they ever find themselves sitting on a yacht with a billionaire — they’ll know what to say to them.

Where do the titans of industry and politics go to school? We can answer that question — and focus on those places. EAs should be hosting events, taking out ads, taking on fellows — I feel like there would be such a big multiplier effect.

There are limits to meme-spreading

I wonder if 80,000 Hours is really thinking about how meme-spreading works, and what memes they want to spread, and understanding the limits to meme-spreading.

I think sometimes 80,000 Hours just tries to write what they think, and then if they want to correct themselves they write a correction. They’re modelling it as though everyone is going to read their posts and process all the content, rather than the more accurate way, which is something like: people will read the headline and take away some very high-level, low-fidelity thing.

I’m not sure 80,000 Hours are thinking enough about how confident they are in the value of getting a lot of people to believe a very simple thing before they promulgate it.

They make it more likely that people will ignore their gut feelings

Some of the worst decisions I’ve ever made in my career were ones I thought through very carefully. I thought of all the key considerations explicitly, I considered counterfactuals, I looked at my comparative advantage — but they were ultimately bad decisions.

Major things like where to live, or where to go to school… at a gut level it didn’t feel right, but the alternatives just didn’t seem to make a lot of sense.

I think it would have been perfect for me to take time away from school, to explore things. But there wasn’t the social support, or the grants that exist today.

When you get what everyone thinks is a great job offer — it seems like madness to turn it down. To say “I’m actually just going to take some time away, to really think about what I want to do”. But that’s what might have been right for me.

I had a gut feeling that a big decision would make me unhappy, but I couldn’t really say why explicitly. When you’re thinking about everything very carefully, it’s easy to be dishonest with yourself about how important personal happiness is to you.

I’m glad that these career tools exist, but I want people to take any intuitive aversion to a decision seriously, even if they think really hard and can come up with no way of explaining it. If you just have a feeling that you don’t like a city, but you don’t know why, that should count as evidence that moving there is a bad idea.

In these circumstances, open up your option space if you can. Think about things that seem totally out of the box, “no one would ever do that”; put that into your option set and see how you feel about it.

It’s not that 80,000 Hours is explicitly bad on this, but the tools it provides make it more likely that people will make the mistake of ignoring their emotional responses.

It pushes people towards direct roles too early

I think it pushes people towards getting direct roles too early, and that contributes to the problem of people being so concerned about “what roles are open now?”


I worry about encouraging a whole bunch of people to go into jobs when those jobs don’t exist. I worry about overly general career advice.

It’s not accessible for people from poor backgrounds

I don’t think 80,000 Hours is well designed for people from poor backgrounds. There’s a big group of people that no one is talking to, and if they had access to the right advice, it could make an enormous difference. Opening up 80,000 Hours as a realistic possibility to poorer people could be incredibly valuable.

The definition of ‘earning to give’ is too narrow

I had a problem with your definition of earning to give (ETG) [the part about choosing a job to earn more money in order to give].

I think if someone is mid-career, not willing to make a big change, but wants to donate 10% of their income to effective causes — it seems like we could call that earning to give. But that would be a very different definition from 80,000 Hours’.

On the EA Survey, you’re asked “how are you an EA? Is it research, other direct work, or ETG?” But then people have to pigeonhole themselves. If they’re not a researcher, and they don’t do direct work, they have to put ETG.

Editor’s note: 80,000 Hours doesn’t write or manage the EA survey.

Should provide more support for people who aren’t primarily focused on the long-term future

I think 80,000 Hours should provide people who aren’t longtermists with a little more guidance. Even if you overwhelmingly care about the far future, I think there’s value to having a broader base — and if 80,000 Hours better supported people who care about things like global development and factory farming, that would help protect against further narrowing EA into a smaller and smaller group of people.


I think 80,000 Hours is generally doing a great job, and I recommend that people go there. But I want to be able to make those recommendations without having to worry about people becoming demoralised. So many people are introduced to EA by 80,000 Hours, and a lot of those people come in caring about global poverty, or animal welfare. Maybe they’re not ready to make a big change straight away to caring about the far future — and I think it’s really important that they not be turned away.


All entries in this series

  1. What’s good career advice you wouldn’t want to have your name on?
  2. How have you seen talented people fail in their work?
  3. What’s the thing people most overrate in their career?
  4. If you were at the start of your career again, what would you do differently this time?
  5. If you’re a talented young person how risk averse should you be?
  6. Among people trying to improve the world, what are the bad habits you see most often?
  7. What mistakes do people most often make when deciding what work to do?
  8. What’s one way to be successful you don’t think people talk about enough?
  9. How honest & candid should high-profile people really be?
  10. What’s some underrated general life advice?
  11. Should the effective altruism community grow faster or slower? And should it be broader, or narrower?
  12. What are the biggest flaws of 80,000 Hours?
  13. What are the biggest flaws of the effective altruism community?
  14. How should the effective altruism community think about diversity?
  15. Are there any myths that you feel obligated to support publicly? And five other questions.


Anonymous contributors answer: Should the effective altruism community grow faster or slower? And should it be broader, or narrower?
https://80000hours.org/2020/02/anonymous-answers-effective-altruism-community-and-growth/ (17 Feb 2020)


I want anyone in effective altruism to think “wow, this is such a great community to be part of — I think this is great!”, rather than feeling really ambivalent, or even stressed, or finding themselves often annoyed at other people.

Anonymous

The following are excerpts from interviews with people whose work we respect and whose answers we offered to publish without attribution. This means that these quotes don’t represent the views of 80,000 Hours, and indeed in some cases, individual pieces of advice explicitly contradict our own. Nonetheless, we think it’s valuable to showcase the range of views on difficult topics where reasonable people might disagree.

This entry is most likely to be of interest to people who are already aware of or involved with the effective altruism (EA) community.

But it’s the eleventh in this series of posts with anonymous answers — many of which are likely to be useful to everyone. You can find the complete collection here.

We’ve also released an audio version of some highlights of the series, which you can listen to here, or on the 80,000 Hours Podcast feed.

Did you just land on our site for the first time? After this you might like to read about 80,000 Hours’ key ideas.

In April 2019 we posted some anonymous career advice from someone who wasn’t able to go on the record with their opinions. It was well received, so we thought we’d try a second round, this time interviewing a larger number of people we think have had impressive careers so far.

It seems like a lot of successful people have interesting thoughts that they’d rather not share with their names attached, on sensitive and mundane topics alike, and for a variety of reasons. For example, they might be reluctant to share personal opinions if some readers would interpret them as “officially” representing their organizations.

As a result we think it’s valuable to provide a platform for people to share their ideas without attribution.

The other main goal is to showcase a diversity of opinions on these topics. This collection includes advice that members of the 80,000 Hours team disagree with (sometimes very strongly). But we think our readers need to keep in mind that reasonable people can disagree on many of these difficult questions.

We chose these interviewees because we admire their work. Many (but not all) share our views on the importance of the long-term future, and some work on problems we think are particularly important.

This advice was given during spoken interviews, usually without preparation, and transcribed by us. We have sometimes altered the tone or specific word choice of the original answers, and then checked that with the original speaker.

As always, we don’t think you should ever put much weight on any single piece of advice. The views of 80,000 Hours, and of our interviewees, will often turn out to be mistaken.

What’s your current view on effective altruism (EA) movement growth? Should it grow faster, or slower? Should it be broader, or narrower?

Effective altruism should be broad

We should be aiming for 100% of everybody. Everyone who is a philanthropist should know what effective altruism is — but almost no philanthropists I talk to outside of EA know what it is currently. They’ve never heard of 80,000 Hours.


I worry about EAs becoming uninterested in broadening the movement. I think it’s a mistake that EA Global was bigger four years ago than it is today. The effective altruism community needs to be more than a few thousand talented, passionate people in order to achieve its goals.

It should stay narrow

I think effective altruism should stay pretty narrow. It’s very hard to manage a very large movement, and EA is still partly figuring out what it is. So I think we should be aiming for people at something like the 90th percentile of altruism and the 99th percentile of performance in their fields or areas of study — that’s quite a narrow segment of society.

I think it would be good if we were growing a bit faster than we were. I feel like the optimal is steady, exponential growth — where steady means not slow to the point where people start feeling bored or unexcited. And I worry that we’re currently maybe at the low end of the good range.

I definitely don’t think we should be going for an all-out broad, environmentalist-esque mass movement.

It should focus on having a great culture first

Rather than growth, the thing I’d most want to alter is effective altruist culture. I want anyone in EA to think “wow, this is such a great community to be part of — I think this is great!”, rather than feeling really ambivalent, or even stressed, or finding themselves often annoyed at other people.

I think there’s stuff that could change that would make EA feel more like that first description. One way would be cultivating an attitude of celebration and welcomingness. I’ve heard people talking about how earning to give was ‘denigrated’ in EA, that no one should be getting career capital, things like that. And I think that’s evidence both of how ideas about what people ought to do spread in EA, and of how sensitive people are to them. So things get very exaggerated.

Whereas if we could create a community atmosphere which is like “oh, you’re a school teacher who donates 10% of your income to AMF? We love you! Thank you for being in this world and doing so much good — I’m honoured to be in a community with you”.

I feel like that would be a healthier, less stressful community, where people made better decisions. Compared to the current situation, where people can feel that unless you’re working for a handful of organisations that are focused on short AI-timelines — you are basically worth nothing.

We do select for people who are very scrupulous and anxious, but maybe that just means we have to work much harder to counter those impressions. I mean, EA is just like “what’s the thing you can do that’s the optimal, most important thing?!” — it’s kind of a stressful idea. Perhaps we could do more to counteract that.


I think the ideal EA community would be a place where everyone really promising who finds it, likes it. I don’t think EA is that right now. That’s also a really high bar to hit, that’s hard. But that’s what I care about more than how fast EA is growing.


I think that the bad aspects of EA might be putting people off these ideas by making them seem kind of cold, nerdy, and dismissive. So broad vs. narrow isn’t how I’d think about this. I’d want to focus more on changing the tone of EA, putting out ideas in a way that seemed more helpful — and then I’d be much more likely to think it was good if it grew faster.


I want us to have somewhat better infrastructure for making people who are joining feel slightly held, rather than lost.

I think the issue of “it’s hard to get jobs in EA” is a sore point, and I worry it could be exacerbated further if growth continues without improving that situation. But if we have better things there, then growth seems great.

It should take a higher variance approach to recruitment

I’d be doing more to recruit people with more diverse mindsets.

I think I might take a higher variance approach. There are all these EAs who say “as soon as I heard about EA, I was instantly an EA”. So that’s great: you can get those people quickly and easily. You just need to make sure they’ve heard the definition of effective altruism, and had the chance to get a copy of Famine, Affluence, and Morality.

And then there’s another group of people who are sympathetic to EA principles, but aren’t as naturally drawn. And they might have talents that seem really useful to the community.

So I might have a two-pronged strategy where I: i) focus on the easy cases, and ii) make sure we pick up the people who aren’t as naturally drawn, but who have skills we really need.

I think you could have an approach that was less focused on medium-level engagement.

If you have one EA person who has a skill that’s really in demand in the community — get that person to work on recruiting more people like them. For people who aren’t immediately drawn to EA, it’s really useful to get the message from someone who is similar to them.

I don’t think natural EA enthusiasts — young, nerdy, philosopher types — are very good at convincing people who aren’t like that.

You could eventually have this system of a person convincing someone who’s a little bit farther, and then that person convincing someone else who’s a little farther away still — and emphasise to everyone along the way how valuable a role they can play in movement building and recruitment.

We should be wary of committing to one direction

I think we’ve got a problem here. The right thing to do given short AI timelines is not the right thing to do given longer AI timelines.

If AI timelines are short, then we probably shouldn’t be focused on becoming a mass movement. There are only so many things you can do at once, and some aspects of being a mass movement are in tension with acting on short timelines. In particular, if almost every EA leader is working extremely hard on a short AI timeline scenario, then most public outreach is going to end up being quite deceptive. It’s probably not going to say “hey, this is a movement of people who think the world as we know it won’t exist in 10 years”.

If AI timelines are not short, EA should be focusing on becoming a mass movement. I think movements that don’t put a lot of energy into successfully handing down ideas to younger people, and transferring expertise — end up ceasing to exist.

But because we’re unsure, we end up being ambivalent on this question. And I think that’s bad in some ways, although it might be better than committing to one direction or the other.


I’m really happy about the switch that happened from expanding as quickly as possible to expanding carefully in high-fidelity ways. I’m now thinking that maybe it needs to shift slightly back in the other direction, but I’m not sure.

We should consider people who aren’t ready for a big commitment

I know some people have said, “all movements have a spectrum of involvement, and that’s going to happen with EA”. But it is possible to say “well, there are certain criteria that qualify you as an EA”. And then if you haven’t met those (if you haven’t quite donated 10% of your income or free time effectively, or moved to an effective career, etc.), you could say you’re an aspiring EA.

I know some people think ‘effective altruism’ as a term is too presumptuous, and that everyone should call themselves “aspiring EAs”. I personally think that’s too modest, because there are a lot of people who are doing really good work.

But the term might work for someone who says “I’m on board with the ideas, but I’m only donating 1%. I haven’t quite made the shift”. What I say to them is that there’s a whole field called diffusion of innovations, which has found that when people change their minds, they may not change their behaviour for a year or more. That’s normal. And we shouldn’t be frustrated with people who haven’t made this bigger shift yet.

This distinction would allow you to pull people in who might not be ready for a big commitment, without diluting the active EA community.

We should reach out to influential people

I think EAs should focus more on reaching out to people who already have influential positions.

We should want more exposure for both good and bad ideas

A big part of me intuitively thinks that, insofar as these ideas are correct, I want more people to encounter them. And even if they’re wrong, I want more people to encounter them — because that’s a good way of getting rid of bad ideas. I think it’s generally good to shine light on ideas. So that’s an argument for getting more exposure for the ideas, and then letting that affect growth one way or the other.



Anonymous contributors answer: What’s some underrated general life advice?
https://80000hours.org/2020/02/anonymous-answers-general-life-advice/ (13 Feb 2020)


When you’re at school, you can just spend hundreds of hours really getting to know someone, and you just never really get time for that as an adult. And so if you lose an old friend, then you’ve just lost this huge investment.

Anonymous

The following are excerpts from interviews with people whose work we respect and whose answers we offered to publish without attribution. The advice is particularly targeted at people whose approach to doing good aligns with the values of the effective altruism (EA) community, but we expect most of it is more broadly useful.

This is the tenth in this series of posts with anonymous answers. You can find the complete collection here.

What’s some underrated general life advice?

Maintain old friendships

Keep up old friendships — they’re worth a lot. As an adult, you don’t really make new friends in the way that you do as a kid. You can think about it as a time investment: when you’re at school, you can just spend hundreds of hours really getting to know someone, and you never really get time for that as an adult. And so if you lose an old friend, you’ve lost a huge investment.

In particular, with old friends, you might sometimes feel distant from them, you might feel like you’re drifting apart — but you’ll often find that regression to the mean kicks in, and you end up being closer again. That just kind of happens over the course of many years. I’ve been really happy that I’ve taken the time to maintain old friendships.

Figure out how to spend money to save time

It’s worth doing quick dollar-per-hour calculations for things that could save you time. To my surprise, I found that many things make sense even if you value your time at only on the order of $1 per hour.

They include things like a dishwasher (you can get a portable or countertop version if you are in a single-person household); in-apartment laundry (you can get a combination washer-dryer that uses a standard outlet and no vent — this actually saves you money even when you count the increase in rent due to the lost square footage); voice recognition software (one of the biggest time-saving interventions); a hands-free phone, so you can do housework while on a call; a treadmill desk or exercise bike that allows reading or computer work; and a wireless headset, so you can listen to podcasts while grooming.
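
A minimal sketch of that dollar-per-hour test (the function and every number here are hypothetical examples, not the contributor’s figures):

```python
# Effective cost of a time-saving purchase, in dollars per hour saved.
def cost_per_hour_saved(price: float, lifetime_years: float,
                        hours_saved_per_week: float) -> float:
    total_hours_saved = hours_saved_per_week * 52 * lifetime_years
    return price / total_hours_saved

# e.g. a $400 countertop dishwasher lasting 5 years, saving 3 hours a week:
print(cost_per_hour_saved(400, 5, 3))  # ~0.51, i.e. about 51 cents per hour saved
```

If the result comes in below whatever you value your time at, the purchase passes the test.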

Find a supportive partner

I think one of the most important things is finding a supportive partner. I might not want to say that publicly. A lot of the things that helped you the most personally along the way are things you don’t want to emphasise, even if they were important to you. The reverse statement is perhaps even more important: don’t have an unsupportive partner. More generally, surround yourself with supportive people, including mentors in your field.


We need to make sure we continue to make time for relationships. Finding a supportive partner is extremely beneficial. Maintaining the strength of that relationship is also hugely valuable — strengthening your relationship can be a great way to reduce your overall stress levels, and allow you to better focus on your work.

Set aside time to think about the big things, then just live your life

Every so often, I have one chunk of time that I spend reflecting on the big things — my overall life plan, what I should work on next — and the rest of the time I don’t really think about it. I feel like that’s about as effective as thinking about it all the time, if not more effective. You’re probably less likely to burn out if you’re not beating yourself up about absolutely everything you’re doing. It’s fine to take some time to think about the big things, then just live a normal life the rest of the time.

Have a plan for what to do if your mental health deteriorates

Learn to identify signs of mental health deterioration in yourself, and have a plan for what to do if you see it happening.

This might be something that you naturally start to do as you get older. When you’re young, maybe you just don’t notice these things. Eventually you start to notice the signs that your mental health is getting worse, and recognising those signs becomes something you want to pass on to others.

You might have stages where you think, “okay, I am feeling basically level one sadness”. And you know roughly what you can cope with in this stage. But you also know the signs of level two sadness, and if you get there — maybe you know that you need to take time off from work and go into recovery mode. Or maybe you go to your doctor. Or call a friend. Or re-read a certain book that reminds you of lifestyle things you can do to make you happy.

I think people start to have these plans as they get older, and they find them really useful.

Consider an unconventional life

Having a generally unconventional life that’s highly customized to your values (in my case, effective altruist ones) and preferences is tough, and people like to stress the risks, especially when talking to younger people. But if you can get away with it, it is just a really fulfilling and joyful way to live.

I think I’m much happier than most of my old friends, on many dimensions, and I feel I get to express more of myself and generally be more myself, and have such a rich sense of meaning and purpose which a lot of people my age seem to struggle to find. I think if I’d gone a more conventional route, a lot of the things that are most precious to me wouldn’t even be a part of my life.

I don’t know what the odds of success with this are, and I do know of some people who tried to do something less conventional and I think it was quite bad for them, so I can’t say I recommend it from a self-interested perspective. I’m just saying that at least for me and a few other people I know, it’s made our lives much better and more meaningful. There’s a lot to gain, as well as to lose.

Get plenty of sleep and exercise

Get enough sleep. Exercise regularly. Experiment a lot with different ways of boosting your happiness and productivity throughout the day. Maybe that means getting lots of light, maybe that means drinking coffee with theanine, maybe that means chewing nicotine gum, maybe that means working in pomodoros — maybe not. The possible payoffs to these kinds of things are very high.


There are pieces of advice that people sometimes don’t give, that seem really useful. Like, “focus a lot on your sleep, it’s really important”, or “exercise regularly, it’ll make you feel better”.

Do whatever the optimal thing is

Especially within the effective altruism community, it’s incredibly valuable to have people who are willing to do whatever the optimal thing is. Such a person is probably 10x more valuable than someone who’s really fussy.

Be able to give and receive blunt feedback

Being able to receive blunt criticism and feedback is a very hard, but very valuable skill. And being able to deliver it in a way that makes the other person feel respected is also a very hard, but very valuable skill.

Spend less time reading the news

Dramatically limit the time you spend on learning non-durable things, like news and Twitter. It’s not just the time spent; people also tend to end up less informed about big-picture issues, for instance coming to believe the world is becoming more violent or more unequal. One compromise I found is the BBC’s “The World This Week”, which is just 20 minutes a week.

Editor’s note: You may enjoy these pieces on homicide and global income inequality from our friends at Our World In Data — two of dozens of articles they have tracking the state of the world and how it has changed over time.

Be honest with yourself about relationships

If you’re asking yourself the question, “should I break up with my partner?” — the answer is probably yes. But it’s also not that bad if you stay with someone who you’re okay about.

Target beliefs rather than groups

Avoid making arguments that needlessly involve criticising groups that people are part of, because it’s often hard for people not to see it as a personal attack.

For example, if I were talking about a religious belief I disagree with, I’d try to talk about that specific belief and not about the religion as a whole, since not all members of that group will necessarily share that belief.

It’s a bit like if you’re a member of an intellectual group, and other people say “here’s what these feminists think”, or “here are the beliefs of these liberals” — and you wish they’d just target a belief that they think is bad. Why go via the group? It just seems likely to antagonise people in a way that’s totally unnecessary.

Insofar as a large group of people have a belief that I disagree with, I feel pretty comfortable targeting that belief. It seems both more effective and more honest.

Carefully plan for retirement

I have a retirement spreadsheet. The biggest factor in when you can retire is your return on investment.

The general rule of thumb is to retire when you have 25 times your current annual expenses (some people say income, but that can be very different from expenses if you donate or save a lot). If you are retiring earlier than the median age of 61, you could argue you need more to retire; on the other hand, if you have a better return on investment, you could retire with less. And if you do retire with more money, then you can afford to invest it riskily and continue to donate a lot in retirement.

I think retiring on 25 times consumption of about $50,000 for a four-person family (i.e. $1.25 million), and investing riskily with a backup plan (like getting a less desirable job) in case there is a global depression, could make sense.

One should consider Social Security, possible inheritance, reduced expenditures because of owning your home (if you plan to do that), etc. In the US, you have to worry about inflating healthcare costs, but at least they have been a stable percent of GDP for the last 15 years or so.
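
A minimal sketch of the 25x rule of thumb described above, which is equivalent to assuming a 4% annual withdrawal rate (the multiple and the expense figure are the contributor’s illustrative numbers, not advice):

```python
# Nest egg implied by the 'N times annual expenses' rule of thumb.
def retirement_target(annual_expenses: float, multiple: float = 25.0) -> float:
    return annual_expenses * multiple

print(retirement_target(50_000))  # 1250000.0: the $1.25 million figure above
```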

Make conscious use of your self-knowledge

I think people underrate some very meta skill of having self-awareness and examining advice you get critically, with an eye to whether it’s relevant to you. Knowing “am I generally too risk-taking or risk-averse? Do past mistakes make it seem like I’m underconfident or overconfident? Am I likely to do too much of X or too little?” is like a neat trick for making vague advice more useful and less dangerous.

It’s been surprising to me how I’ve been able to accurately answer the question “how will I later think my current actions or behaviour is flawed?”, but I don’t adjust my behaviour automatically… I need to consciously ask the question, then adjust deliberately. I know some other people are similar. Sometimes you know a lot about yourself, you just need to remember to bring it into the foreground.

Don’t live in the centre of big cities

The most misguided common opinions among people I respect are: risk aversion in investing; not prioritizing positive impact; not valuing the long-term future; thinking that weather is important for happiness; buying a house in the centre of a city (I think autonomous vehicles will lower central real estate values, at least relative to ‘business as usual’); thinking it’s a good idea to live in the centre of major cities in NATO countries (misguided, I believe, because of nuclear war risk); and various cognitive biases.

Be brave

If you can afford to, just be braver.

Have a sense of humour

Don’t take yourself too seriously.



Anonymous contributors answer: How honest & candid should high-profile people really be?
https://80000hours.org/2020/02/anon-answers-honesty/ (9 Feb 2020)


I feel like I’ve observed a lot of situations at big companies, where you can see this person acting as though they’re in Game of Thrones. And you come back in two years and… they just haven’t gotten anywhere.

Anonymous

The following are excerpts from interviews with people whose work we respect and whose answers we offered to publish without attribution. The advice is particularly targeted at people whose approach to doing good aligns with the values of the effective altruism (EA) community, but we expect most of it is more broadly useful.

This is the ninth in this series of posts with anonymous answers. You can find the complete collection here.

How honest and candid do you think high-profile people ought to be?

The devious don’t win out

I think a lot of people naturally tend to assume that playing politics is going to get them a lot of benefits that I don’t think it’s going to get them, and that being honest is going to get them a lot of pain that I don’t think it’s going to get them.

I’m not an absolutist, I think there are times to not be honest. But I see way more people screwing up in the other direction.

I feel like I’ve observed a lot of situations at big companies where you can see a person acting as though they’re in Game of Thrones. And you come back in two years, and they just haven’t gotten anywhere. And you kind of assumed they were going to win the Game of Thrones, because that’s what happens on TV. You know, the devious liar usually gets at least some kind of temporary victory somewhere in the plot, otherwise what’s the point of that character? But I’ve been very underwhelmed with how far these people get in real life. And people who are just really good at what they do, who are really honest all the time — I’ve seen them do really well. I don’t think it’s an absolute by any means; there are certainly exceptions.

People who are dishonest are just way more obvious than they think they are. Everyone catches on to it really fast. They start suffering the costs really fast — faster than they probably think. And people who are honest become trusted, and that’s a big deal.

It’s one of those things where I wonder if entertainment screws up our expectations, because in real life things are just way more boring. People trust trustworthy people, and being trusted is really valuable.

You can convey a message in many different ways

I think it’s really important that effective altruism be a community in which important, true things can be said. It would be pretty damaging if there were important things that the effective altruism community had to strongly discourage all its high profile people from saying — and so we’ve got to figure out how to create conditions where that’s not the case.

But I also encounter people who seem to be picking fights that they don’t need to pick — that aren’t important. If you’re a high-profile person, you should make sure that you’re standing up for things for good reasons, and not just because you’re a contrarian.

Sometimes people seem like they’re deliberately taking bigger social penalties than they have to. They could have said things in a way that didn’t upset people nearly as much, but they didn’t, on principle. And I think that’s an immoral principle. Doing good sometimes means trying very hard to make sure that the most important ideas land as well as possible. You should be willing, on principle, to make sure you’re communicating in a way that doesn’t make people mad, and that instead of trying to look clever, you’re just trying to improve our collective understanding of the world. And usually if you do that, you won’t face a lot of controversy.


I think very honest. In general I think being dishonest is a bad strategy.

But one thing I really want to distinguish is being honest vs. choosing your words carefully. Different topics have different levels of sensitivity, and you can often convey a message in many different ways, with very different tones and connotations, via a different choice of words. And sometimes people, especially those with more of a contrarian bent, will take a message that could be phrased in many ways and either deliberately phrase it in a contrarian way, or phrase it in a way that is not generous to the listener. And I think that’s actively bad.

And sometimes people will defend that, by saying “oh, I’m just being honest”. But speech is not just literal communication. When you say something, you’re doing a lot of things. You’re not just conveying literal meanings of words. And so, if you’re talking about a particularly sensitive topic, then being attuned to that, and taking the time to think about what sort of other messages you might convey and then choosing your words such that you convey the right connotation as well as the right literal meaning — I think that’s very important.


I’m generally in favour of increasing honesty where possible, but that doesn’t mean you have to be blunt or offensive.

But I think there’s a virtue to not shading things too much based on who you’re talking to.

If you say a lie often enough, you can start to believe it

I generally think it matters more that we in this community be honest than it does for other people. Part of that is that I place some non-utilitarian value on being moral.

Some of the most costly things that can happen to a movement happen because of dishonesty.

I think that sometimes people come up with an idea that has some merit, but they come to find it much more persuasive than they should.

I’ve seen people convince themselves of the craziest things, because that’s what they felt they had to keep saying to donors. And once they’d said it enough, it seemed like it was too psychologically costly for them to admit that they’d been exaggerating for such a long time. By the end they actually believed that a pretty ineffective intervention was the absolute most effective thing — even though it would have seemed ridiculous to them if they’d never had to advocate for it.

Advocates are really prone to this, because of the importance placed on confidence. If you’re going on television, or fundraising — you don’t feel able to say “well, I think there’s a 70% chance that this intervention is a good idea”. In the real world, for people to want to work to pass your law or donate millions of dollars, they want you to be certain. So there might be a value to acting this way sometimes, but there is this real risk that you’ll end up believing your own hype.

We’re social creatures

I think radical honesty is misconceived. Some people think “if only people just told the truth all the time, everything would be fine”, but I don’t think that’s the case. People say “I don’t do small talk, let’s get straight to the issues” — but that overlooks the fact that we’re complicated social creatures. Small talk isn’t just an exchange of information; it’s a signal of cooperation and friendliness. In general, I think we should be really careful before we propose anything that seems to steamroll over the top of hundreds of thousands of years of highly nuanced social norms.

I think once you say something publicly, you’re saying it to yourself too. We often haven’t formed a view on a topic before we’re asked, but once we are we just say something — and whatever that is can become our “view”. It may not be true, but we’re now more likely to think it is.

Good reasons for not sharing information

I don’t think people should lie, but I think there are many good reasons for not sharing information.

The most common is sharing negative information about people or groups. If you know that someone is far less competent than they’re claiming to be, that’s extremely useful information — but can also be uniquely damaging to that person, and even to the dynamics of a community. The taboo that exists broadly against sharing this kind of negative information publicly exists for a reason.

One consequence is that as soon as you have a reputation for sharing information, people stop talking to you. They’ll self-censor before that information gets to you.


There’s a difference between lying and not being forthcoming. I don’t have a problem with not talking about things that might get you into trouble.


Honesty makes sense to me as a general principle, unless you have a really good reason to avoid it.

There are real consequences for being completely candid

I think openness is generally very good. The thing that makes me hesitate is that people can be punished so severely for being very open. It’s so much to ask.

I don’t want people to be deceptive about the way they think about things, and I certainly don’t want them to lie. It rubs me the wrong way when someone thinks about everything in a very PR-sensitive way, worrying about how their views might affect the image of something, because I think that generally leads to deception.

I think deception is really easy to pick up on. People know when you’re doing PR spin, and at the same time people will punish you for being forthright if they don’t like what you have to say. It frustrates me that the same people who dislike those washed-out, PR-style views still heavily punish people who state views they disagree with.

I know a lot of people who have been really harmed for even somewhat considered public statements they’ve made that have been controversial. And the harms are so great, that I can’t say “oh, yeah, high-profile people should be super forthright about everything that they believe”. Because people get sent threats and have their lives ruined for doing that.

And so you’re put in a dilemma while those norms exist; you want to support complete honesty, but you also want to live in a world where people don’t have to worry about their personal safety and welfare if they’re completely honest, which can happen for expressing opinions on a lot of different topics — animal rights, feminism, religion etc.

Openness is altruism

From the perspective of a not-so or not-yet-successful person, you should want high-profile people to be really honest so that you can understand how they achieved their success.

So I think it would be altruistic if more high-profile people expressed their views more openly. But maybe there are reasons it would be costly to them, e.g. to tell the story honestly they’d have to bring up an awkward fact, or show support for an unpopular position, or tell a story that portrays them in a bad light.

Present things differently in different contexts, but don’t distort the facts

It makes sense to present things differently in different contexts. But it’s important that you’re telling the truth in a different way, as opposed to actually distorting the facts. Common-sense morality, as well as the pragmatic question of “what if I get caught?”, both seem to align on the position that you should basically just be honest.

People shouldn’t be forced to discuss everything

I’m generally very in favour of honesty. I don’t like deception, at all. But it feels like people should be entitled to some privacy. As in, if they have views that they think of as controversial, and they don’t want to discuss them, I don’t think they should be forced to. For example, if you’re religious and you’re aware that not a lot of people around you are religious — I don’t think you should have to talk about that. Not everyone is entitled to know everything about your beliefs.

Spend your ‘weirdness points’ where it really counts

If in doubt, I think people should generally err on the side of being more diplomatic. It’s easy to get caught up in your bubble — to think that everyone else thinks the same way that you do — and then it’s easy to put your foot in your mouth. But most of the senior people who hold most of the power probably don’t think the same way you do. So if you’re going to be weird, try to spend your weirdness quotient where it really counts, and try to be as normal as possible the other times.

Honesty in the effective altruism world

I think people in effective altruism, including high-profile people, are doing very well in terms of honesty. In the cases I can think of where people in effective altruism haven’t adhered to norms of transparency and coordination, things have gone very badly: people have been really upset, and it creates a bad reputation.

Admit mistakes publicly

They should be quite honest on pragmatic grounds. I don’t think lying is a successful long-term strategy — I don’t think you’ll get away with it.

I encourage people to admit mistakes publicly. It’ll actually show a level of confidence — it shows that you’re confident enough to admit you have flaws.



Anonymous contributors answer: What’s one way to be successful you don’t think people talk about enough?
https://80000hours.org/2020/01/anon-answers-one-way-successful/ (28 Jan 2020)


It’s just so valuable to have someone who’s totally on board with the mission of the organisation they’re working for, and happy to do just absolutely whatever matters.

Anonymous

The following are excerpts from interviews with people whose work we respect and whose answers we offered to publish without attribution. This means that these quotes don’t represent the views of 80,000 Hours, and indeed in some cases, individual pieces of advice explicitly contradict our own. Nonetheless, we think it’s valuable to showcase the range of views on difficult topics where reasonable people might disagree.

The advice is particularly targeted at people whose approach to doing good aligns with the values of the effective altruism (EA) community, but we expect most of it is more broadly useful.

This is the eighth in this series of posts with anonymous answers. You can find the complete collection here.

We’ve also released an audio version of some highlights of the series, which you can listen to here, or on the 80,000 Hours Podcast feed.

Did you just land on our site for the first time? After this you might like to read about 80,000 Hours’ key ideas.

In April 2019 we posted some anonymous career advice from someone who wasn’t able to go on the record with their opinions. It was well received, so we thought we’d try a second round, this time interviewing a larger number of people we think have had impressive careers so far.

It seems like a lot of successful people have interesting thoughts that they'd rather not share with their names attached, on sensitive and mundane topics alike, and for a variety of reasons. For example, they might be reluctant to share personal opinions if some readers would interpret them as "officially" representing their organisations.

As a result we think it’s valuable to provide a platform for people to share their ideas without attribution.

The other main goal is to showcase a diversity of opinions on these topics. This collection includes advice that members of the 80,000 Hours team disagree with (sometimes very strongly). But we think our readers need to keep in mind that reasonable people can disagree on many of these difficult questions.

We chose these interviewees because we admire their work. Many (but not all) share our views on the importance of the long-term future, and some work on problems we think are particularly important.

This advice was given during spoken interviews, usually without preparation, and transcribed by us. We have sometimes altered the tone or specific word choice of the original answers, and then checked that with the original speaker.

As always, we don’t think you should ever put much weight on any single piece of advice. The views of 80,000 Hours, and of our interviewees, will often turn out to be mistaken.

What’s one way to be successful you don’t think people talk about enough?

Take motivation seriously

Maybe the most useful advice I’ve ever had was someone telling me to think about my motivational fit with job options as opposed to personal fit.

Not being concerned with an external checklist of "What skills does this position need?", but instead asking, "How do I feel about the prospect of working at this place for eight hours a day, for years?"

Take seriously what you actually feel motivated by.


Really enjoy what you're doing — be sure that you've found the right job. When you do, people can tell: you'll look like you're living your best life, and I think that's inspiring for others.

Be willing to do whatever matters

It’s just so valuable to have someone who’s totally on board with the mission of the organisation they’re working for, and happy to do just absolutely whatever matters.

Because most people aren’t really like that.

When you're a manager and you have to think not only about "What's the best thing for this person to do for the organisation?" but also about the idiosyncratic ways in which they're fussy, it massively increases management overhead and can quite drastically decrease their usefulness.

Whereas someone who thinks “I don’t really care about what title I have, I don’t really care what I do, I just want to advance the mission of the organisation” is just way more valuable. Often many times more valuable.

My guess is that this works better too in terms of advancing your career.

You start to learn skills on the things that are actually most valuable, and you just get the reputation of someone who really gets stuff done — no matter what that stuff is.

In effective altruism (EA), there's a further consideration. If some career capital is zero-sum — titles can be like this — then, altruistically speaking, there's nothing to gain from you getting it rather than someone else.

Unless, that is, the zero-sum good is a way of rewarding genuine contributions: the most senior people in an organisation should get the better titles. But if you're not that senior and you get a better title anyway, you've just deflated the value of everyone else's titles. Maybe it's net-harmful — it's certainly not net-positive.

Be relentless at pitching ideas

Try to get the least important role at the place you most admire — then try to move up the ladder. Whenever you get the chance, be relentless at pitching ideas. Even if the ideas are terrible, it displays motivation and ambition, and it's always easier for bosses to promote from within the company.

Improve the performance of others

Being successful in roles that involve supporting others: we probably don't talk about this enough. At the same time, I think it's really important that the person actually wants to be in a support role. I don't want people talking up how great it is to have others in support roles when those people don't want to be there.

And those roles are in fact highly skilled. I've known amazing people who were personal assistants or who did administrative work for organisations, and I don't want it to be seen as some second-rate thing. These are really important roles, and they're probably undervalued, but only go into one if you're good at it and you'd enjoy it.


One would certainly be helping to improve the performance of someone else. I'm sure that will be significantly undersupplied, because it's not something the community tends to admire.


I'm a big fan of broadening the ways to be an EA. The discussion is mainly focused on direct work and earning to give. In addition to effective volunteering, there's the concept of enabling an EA. It's sometimes talked about in the context of being a personal assistant to a high-impact person — buying person X more time.

But another way of doing that is by being the partner of an EA — and being happy to take on more responsibility for running a house than you otherwise would have, specifically to buy your partner more time to do high-impact work. This is particularly controversial if discussed in the context of a wife taking on more housework than her husband — it can ignite a whole discussion of gender roles etc. But it really is about comparative advantage. It isn't discussed much, but it can be important.

Become a fundraiser

In EA, we never really talk about fundraising as a career path. If you're a good people-person and are comfortable asking for money, you could add huge value as a fundraiser. It's often easier to convince other people with money to donate to high-impact causes than to make the money yourself.

Have important conversations with friends

Getting someone else to do good things. It could quite easily be the case that most of the social impact in your life comes down to a single conversation with someone you just happened to be friends with, who ends up becoming extremely wealthy or extremely influential, and you’re then able to connect them to the right people. And that’s pretty weird — the entire value of your career can end up coming from who you happen to know. That can be kind of disheartening, but I think it can be true.


I would include talking to your friends about EA as part of effective volunteering.

Do something really well that others in your community can’t

One thing is diversifying EA. When we tell everyone to pursue the same few careers, there's a risk that we'll end up really needing a particular skill we haven't thought of yet. It might be extremely important to have people doing a wide variety of things; we might look back and think, "thank God there's this one person who has this skill".

A few years ago, I don't think people anticipated how important it would be to have people skilled in information security, communications, operations, management, and executive roles.

Often, just being good at something lets you pick up management and executive skills that transfer. So when someone has managed a team for a long time, they can add something valuable to EA.

In general, it’s really good to be able to do something really well that other EAs are not able to do. I’ve often wished that I could find an EA person who does X, where X is something that EAs don’t value a lot — but they don’t exist.

Do high-impact academic research

You really can do high-impact research as a professor. If you build the right relationships, you can publish a lot in conventional areas by co-authoring papers without that much effort, especially if you can help non-native English speakers with their writing. Usually you'll have a review before your official tenure review, perhaps three or four years after starting as an assistant professor.

So you'll have an idea of whether you'll get tenure, and you can try to switch to another university if things aren't going well. Even if you don't get tenure, you can generally get it at a less prestigious university. If there's some chance of value drift, you'll want to prioritise impact in the near term. Perhaps later, as EA issues attract more attention, there will be more professorships actually related to high-impact topics, making it easier to switch universities and maybe even move up in rank.

Take chances for generic success

If you have an opportunity to be successful in a fairly generic way, that can position you pretty well in the future, even if it doesn’t seem like the most effective thing now. Say that you wrote something really good, and someone wants to turn it into a book, and it seems like you could in fact do that really well — but you think, “will this book actually be useful, or effective?”

I think general success can lead to further success; it can create future options.

Establish close working relationships

Maybe long-term working relationships between two or more people — these could be collaborations or partnerships, or arrangements where one person leads and another supports. Sometimes you have a vision, and someone else helps to execute on it.

I think people tend to treat the individual as the unit — something that moves around and takes jobs. But there are some famous collaborators; Kahneman and Tversky are a good example.

Married couples have been collaborators too. There are various cases where one person is the public face of the work and gets all of the credit, and historically that’s been the man in a way we’d now rightfully be uncomfortable with. But I think there is a real potential upside in establishing a close working relationship.

Be patient when switching careers

Taking time out to make a career switch. It's not discussed much and it can be tricky, but advice on precisely how to do it would be valuable.

Teach yourself

For people doing their undergrad, identify and get a handle on broad, useful tools. I can never over-recommend probability and statistics. You don’t even need to do a course in, say, informal logic — you could just order a bunch of books off Amazon and teach yourself.

Sometimes it’s okay to neglect some classes at school in order to focus on more important skills that you’re teaching yourself. You’re going to get worse grades for some subjects, but ultimately the skills you build will likely be worth it — often it’s really valuable to build them early.
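One way to make "teach yourself" concrete: check a counterintuitive textbook result by simulation. Below is a minimal sketch of that kind of exercise — the function name and parameters are our own illustration, and it assumes nothing beyond Python's standard library. It estimates the classic birthday-problem probability that at least two people in a room of 23 share a birthday.

    import random

    def birthday_collision_probability(n_people=23, trials=100_000):
        # Count simulated rooms in which at least two people share a birthday.
        hits = 0
        for _ in range(trials):
            birthdays = [random.randrange(365) for _ in range(n_people)]
            if len(set(birthdays)) < n_people:  # any duplicate counts as a collision
                hits += 1
        return hits / trials

    print(birthday_collision_probability())  # roughly 0.507

Comparing the simulated value against the known analytic answer (about 50.7%) gives you the kind of fast feedback loop that makes self-teaching from books actually stick.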

Actually talk to people

Talk with people. I’m always amazed how often people who have experience are willing to give you five minutes, and that’s usually worth more than anything you’re likely to find online. If you want to go to grad school for a specific subject, try to find someone who went to grad school for that subject. (But try not to be disappointed if they don’t have time!)

Consider earning to give

I'm not crazy about earning-to-give careers — but if you have really strong quant skills, joining a hedge fund with a view to eventually donating millions of dollars is probably underappreciated, at least if your cause area needs extra funding or a more diverse funding base.

Develop good people skills

Frankly, I’ve seen a lot of people get ahead by having good people skills, even in academia. There are loads of smart people who want to go into academia. Being smart isn’t enough. Smart plus good social skills is one way to tip the balance. Another is good time management, but we do talk about that one quite a bit.

Just do things

Actually just do the thing you’re interested in. If you’re interested in ML, start building. If you want to work on policy, start writing about it.
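To make "start building" concrete, here's a minimal sketch of a first ML project — it isn't anything the contributor recommended specifically, and every name and number in it is illustrative. It trains a logistic-regression classifier by gradient descent on a synthetic dataset, assuming only NumPy.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy dataset: two Gaussian clusters, labelled 0 and 1.
    X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)),
                   rng.normal(1.0, 1.0, (100, 2))])
    y = np.array([0] * 100 + [1] * 100)

    w, b, lr = np.zeros(2), 0.0, 0.1

    for _ in range(500):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid predictions
        w -= lr * (X.T @ (p - y)) / len(y)      # gradient of the log loss w.r.t. w
        b -= lr * float(np.mean(p - y))         # gradient of the log loss w.r.t. b

    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    print(f"training accuracy: {np.mean((p > 0.5) == y):.2f}")  # ~0.9 on this toy data

Swapping in a real dataset, adding a train/test split, or rewriting the loop with a library will teach you more than another week of reading would.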




