
Excessively penalizing accounts for being blocked or reported #658

Open
setlightlyupon opened this issue Apr 1, 2023 · 9 comments

@setlightlyupon

Either you've experienced it or you're very lucky. Sometimes, even during civil conversation, someone will be in a bad mood and suddenly block you - even someone who you've regularly interacted with!

Please reduce the penalty for this. Also, shadowbanning and/or reducing account "reach" silently, without transparency, is wrong (but much appreciation for this major step towards transparency). Please show us exactly what we've done to be deboosted. That way, we can correct the behavior and remain allowed into the public square.

Please indicate which of our posts have been reported so that we can correct the behavior.

It's truly frustrating (and Orwellian) when a previously well-functioning account is suddenly extremely throttled. Please increase the speed at which an account can recover after being sent to the e-gulag.

<3 Musk and the new twitter. You're the good guys, and we all know it! You're saving democracy and so much more. Free speech is the bedrock of everything. "Freedom of speech, not freedom of reach" is the antithesis of free speech, however.

@syrusakbary

I actually thought of this as well, and was in the middle of fixing it.

I think we should stop penalizing blocks and mutes long-term, and we can do that by establishing a windowed count limit for blocks and mutes.
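The windowed count limit suggested above could be sketched roughly as follows. Everything here is an illustrative assumption (the window length, the cap, and the class and method names), not code from the repository:

```python
from collections import deque
import time

# Hypothetical sketch: count at most `cap` received blocks within a
# rolling window of `window` seconds, so blocks stop compounding the
# penalty forever and old blocks age out. Values below are assumed.
WINDOW_SECONDS = 30 * 24 * 3600  # 30-day window (assumption)
MAX_COUNTED = 5                  # cap on blocks that count (assumption)

class WindowedBlockCounter:
    def __init__(self, window=WINDOW_SECONDS, cap=MAX_COUNTED):
        self.window = window
        self.cap = cap
        self.events = deque()  # timestamps of blocks received

    def record_block(self, now=None):
        self.events.append(now if now is not None else time.time())

    def penalty_weight(self, now=None):
        """Blocks counted toward the penalty: expired blocks drop out,
        and the remainder is capped so bursts cannot pile up."""
        now = now if now is not None else time.time()
        while self.events and now - self.events[0] > self.window:
            self.events.popleft()  # expire blocks older than the window
        return min(len(self.events), self.cap)
```

With this shape, a pile-on of blocks in one afternoon contributes no more than `cap` to the penalty, and an account recovers automatically once the window passes.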

syrusakbary added a commit to syrusakbary/the-algorithm that referenced this issue Apr 1, 2023
@Xpenzz

Xpenzz commented Apr 1, 2023

People that spam blocks & mutes should be penalized, not the other way around.

@setlightlyupon
Author

> People that spam blocks & mutes should be penalized, not the other way around.

Agreed! The current system means that someone aware of this, with flexible morals, can just block all their political enemies.

@khatharr

khatharr commented Apr 1, 2023

> Please reduce the penalty for this.

Sorry, but can you point to where in the source this "penalty" is issued?

@Sqaaakoi

Sqaaakoi commented Apr 1, 2023

> People that spam blocks & mutes should be penalized, not the other way around.

obviously you are not a minority in any form

@goonette

This comment was marked as abuse.

@darkdevildeath

I believe that there should be a credibility scale for accounts in principle. The credibility level should be visible, at least to other users interacting with the account. The scale could consider:

  1. Account verification
  2. Positive vs negative engagement
  3. Whether the account is new or old
  4. Number of blocks and mutes
  5. Number of reports received
  6. Confirmed phone number
  7. Activity in Community Notes

This scale should be used in various mechanisms of Twitter, including the application of penalties for blocking and muting. Accounts with a higher score on the scale have a greater impact on a user's reach when that user is blocked. New accounts, without verification, with few followers, few likes, many reports, and little reputation in Community Notes would have almost no impact on the penalization of third parties.
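As a rough illustration of how a scale like this might combine those factors, here is a hedged sketch; every field name, weight, and cap below is an invented assumption for demonstration, not Twitter's actual scoring:

```python
# Hypothetical sketch of the proposed credibility scale. All weights
# and caps are illustrative assumptions chosen so the score lands
# roughly in [-4, 8]; nothing here comes from the real algorithm.
def credibility_score(acct):
    score = 0.0
    score += 2.0 if acct["verified"] else 0.0           # 1. verification
    score += 1.0 if acct["phone_confirmed"] else 0.0    # 6. phone number
    score += min(acct["age_days"] / 365.0, 2.0)         # 3. account age, capped
    engagement = acct["positive_engagement"] - acct["negative_engagement"]
    score += max(min(engagement / 100.0, 2.0), -2.0)    # 2. net engagement
    score += min(acct["community_notes_rated_helpful"] / 10.0, 1.0)  # 7. Notes
    score -= min(acct["blocks_received"] / 20.0, 2.0)   # 4. blocks/mutes
    score -= min(acct["reports_received"] / 10.0, 2.0)  # 5. reports
    return score

def block_penalty_impact(blocker):
    """How much weight this account's block carries against others,
    normalized to [0, 1]. Low-credibility blockers carry ~no weight."""
    return max(credibility_score(blocker), 0.0) / 8.0
```

Under these assumed weights, a fresh unverified account with many reports contributes essentially nothing when it blocks someone, while an established, verified account with positive engagement carries close to full weight.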


@PaulNewton

> I believe that there should be a credibility scale for accounts in principle

Same, though I think it needs to be clearly contrasted with a de facto "social credit" score, because that's the kind of semantics people like to latch onto for tools like this (i.e. conflating a low ranking with being banned).
Though they would definitely no longer be wrong if other factors slowly crept in over time, such as social status, class, sex, religion, regionality, etc.

Alternative packaging: Curation (a curation scale, score, or skill). "Credibility" is already socially ambiguous, more so once you mix words like "positive" or "negative" into its measurement. One can be perceived negatively and still have credibility, and vice versa.

> Accounts with a higher score on the scale have a greater impact on a user's reach when that user is blocked.

That is just another system of abuse in an attempt to prevent abuse, one that guarantees positions of power against dissent.
A minimum fix: a blocked user's reach is limited within the higher-scored poster's immediate reach/sphere for a time, but that's still somewhat vague.
While blocked users shouldn't be able to platform themselves through abuse, platformed users definitely shouldn't be able to build moats of abuse.

> Number of blocks and mutes

Just to clarify: an account should not be penalized for how many other accounts it blocks, only for how many other accounts have blocked it.

Suggested ordering, split into categories
Verification metrics (real world)

  1. Account verification
  2. Confirmed phone number
  3. Whether the account is new or old

Behavior metrics (usage)

  1. Activity in Community Notes
  2. Positive vs. negative engagement received by the account, from others and from followers
  3. Positive vs. negative engagement directed at other accounts, by the account and its followers

Penalties

  1. Number of blocks and mutes
  2. Number of reports received
  3. Cool-off period since the peak numbers of #1 & #2
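The cool-off period in the last penalty item could be modeled as a simple exponential decay of the penalty since its peak, so a throttled account recovers on its own; the half-life value and function name here are illustrative assumptions:

```python
# Hypothetical sketch of a "cool-off period": the penalty contribution
# from blocks/reports decays exponentially after its peak, so a
# previously throttled account recovers over time rather than staying
# deboosted indefinitely. The half-life is an assumed tuning knob.
HALF_LIFE_DAYS = 14.0  # assumed recovery half-life

def decayed_penalty(peak_penalty, days_since_peak, half_life=HALF_LIFE_DAYS):
    """Penalty remaining `days_since_peak` days after the peak."""
    return peak_penalty * 0.5 ** (days_since_peak / half_life)
```

For example, with a 14-day half-life, a penalty of 8.0 at its peak would fall to 4.0 after two weeks and keep halving thereafter, directly addressing the "increase the speed at which an account can recover" request in the original issue.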
