
EA anchors too hard on existing orgs/ideas/strategies #2

Open
paul-crowe opened this issue Oct 11, 2022 · 15 comments

Comments

@paul-crowe

No description provided.


paul-crowe commented Oct 11, 2022

Example: Lack of productive competition between orgs

Summary: "To encourage this, I'd love to see more support for individuals doing great projects who are better suited to the flexibility of doing work independently of any organization, or who otherwise don't fit a hole in an organization."

Date: 8th Feb 2017

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:


paul-crowe commented Oct 11, 2022

Example: Only 'whitelisted' activities/goals are really EA

Summary: If it isn't on the shortlist of approved effective activities, it's a waste of time. Examples of whitelisted things: working at an EA-branded organization, or working directly on AI safety.

Date: 7th Feb 2017

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:
In general there seems to be a lot of acknowledgement of this pressure, but also a good deal of pushback:


paul-crowe commented Oct 11, 2022

Example: EAs might not actually change their mind much about values and goals, or form new opinions

Summary: As listed

Date: 8th Feb 2017

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:

This one doesn't seem to hold up, especially since the mass shift of focus to AI/longtermism. Examples of people who updated their values:


paul-crowe commented Oct 11, 2022

Example: Over-focused, over-confident, over-reliant

Summary:

  • Over-focused: "Earning to give" in finance and software is treated as the tried-and-true default route for EAs to take. Why not startups, he asks? They have better results.
  • Over confident: Too many claims not backed up by evidence/research. "Why haven’t more EAs signed up for a course on global security, or tried to understand how DARPA funds projects, or learned about third-world health?"
  • Over-reliant: Too much quantitative data, reducing everything to numbers and disregarding everything else.

Date: 1st May 2014

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:
Over-focused:

Over-confident:

Over-reliant:


paul-crowe commented Oct 11, 2022

Example: Inconsistent Rigor / Standard of Evidence

Summary: "Effective altruists insist on extraordinary rigor in their charity recommendations—cf. for instance GiveWell’s work. Yet for many ancillary problems—donating now vs. later, choosing a career, and deciding how “meta” to go (between direct work, earning to give, doing advocacy, and donating to advocacy), to name a few—they seem happy to choose between the not-obviously-wrong alternatives based on intuition and gut feelings. "

Date: 12th Feb 2013

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:
It should be noted that doing anything to address this (presenting newbies with a prescribed list of 'approved' life paths) would just feed into the "EA is an overbearing cult" objection. Also, this accusation of a "follow your gut" attitude contradicts the claims of "Only 'whitelisted' activities/goals are really EA".


paul-crowe commented Oct 11, 2022

Example: EA has a motivated reasoning problem

Summary: EA has a problem with motivated reasoning and emotional biases which impairs its truth-seeking powers.

Date: 14th Sept 2021

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:


paul-crowe commented Oct 11, 2022

Example: EA makes implicit and mute assumptions

Summary: Looking at the underlying assumptions that create EA culture, and in turn create "intellectual blind spots", specifically relating to homogeneity, hierarchy and intelligence.

Date: 15th May 2020

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:
Point: has there ever been a group which didn't make implicit and mute assumptions? Is this an "EA" issue or a "human being" issue?


paul-crowe commented Oct 11, 2022

Example: EA is overly hierarchical and top-down

Summary: "Cultural norms around intelligence keep diversification at bay. A leader’s position is assumed justified by his intelligence and an apprehension to appear dim, heightens the barrier to voicing fundamental criticism."

EA is driven by the notion of solving all the world's problems through the sheer power of intellect. This leads to a pecking order of smarts, which in turn leads to fear of criticising those on top, lest ye be considered dumb. Doubt = lack of understanding. Guru worship.

Date: 15th May 2020

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:


paul-crowe commented Oct 11, 2022

Example: Longtermism and feedback loops

Summary: No way to tell how things are going, since the results won't be known for another 1000 years. Thus feedback tends to come from peers, increasing the risk of groupthink.

Date: 24th March 2022

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:


paul-crowe commented Oct 13, 2022

Example: Needs qualitative research

Summary: Too much of a focus on numbers, which can allow mistakes to happen.

Date: -

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:


paul-crowe commented Oct 14, 2022

Example: Lack of mentorship and guidance

Summary: Too many people going it alone. Nothing designed to increase group effectiveness.

Date: 2nd Jul 2017

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:
Note that the rate of posts on the "personal development" board has exploded over the past few years.


paul-crowe commented Oct 14, 2022

Example: Neglectedness may be a poor predictor of marginal impact

Summary: The assumption that more good can be done in areas not receiving a lot of attention could be misguided.

Date: 9th November 2018

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:


paul-crowe commented Oct 14, 2022

Example: EA is being slow to recognise its own limitations

Summary: "So EA is discovering the limits of the philosophy that underpins it (Rational Choice Theory). It's just slow. It could move much faster by rejecting it and adopting Effectual logic wholesale."

Date: 28 Apr 2022

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:


paul-crowe commented Oct 14, 2022

Example: OpenPhil made inflation worse

Summary: As listed

Date: 24th March 2022

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:

@paul-crowe

Example: Earning to give should have focused more on “entrepreneurship to give”

Summary: Entrepreneurship can offer a potentially higher reward than the tried-and-true path of earning to give as an employee

Date: 9th Aug 2022

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:
