Social media services have been blamed for many things. In my view, it is important to distinguish the privacy and manipulation issue (what content users are shown and why) from the inappropriate-content issue. Section 230 is most directly about the second.
The problem with inappropriate content is that the government and individuals want someone to blame when inappropriate content appears. It is easy to blame the social media companies, since they are visible and associated with what people see. However, the source of the inappropriate content is not the social media company but the individuals who posted it. At present, these individuals are difficult to identify, and this anonymity, which shields posters from legal responsibility, is the core problem. There seem to be two possible remedies: a) a system that only allows content posts from authenticated users, or b) some system of moderation.
I assume both systems are necessary. Anonymous posting allows individuals who believe they must remain anonymous to post without identification. A spouse hiding from someone who intends them harm is a common example; individuals persecuted by a government are another. Potentially, systems could be created in which the hosting service still requires authentication but does not reveal the identity to the public or to governments. There are reasons not to trust this option as perfect, but many cases could be covered in this manner.
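The escrowed-authentication idea could be sketched as follows. Everything here is hypothetical (the `PseudonymousAuth` class, the `register`/`unmask` methods, and the court-order flag are invented names for illustration, not an existing system): the service verifies identity once, shows the public only a pseudonym, and releases the real identity only under legal process.

```python
import hashlib
import secrets

class PseudonymousAuth:
    """Toy sketch: identity is verified and escrowed privately;
    only a stable pseudonym is ever shown publicly."""

    def __init__(self):
        self._escrow = {}                    # pseudonym -> real identity (private)
        self._salt = secrets.token_hex(16)   # prevents guessing pseudonyms from names

    def register(self, real_name: str, verified: bool) -> str:
        """Verify a user, escrow the identity, and return a public pseudonym."""
        if not verified:
            raise ValueError("identity must be verified before posting")
        pseudonym = hashlib.sha256(
            (self._salt + real_name).encode()
        ).hexdigest()[:12]
        self._escrow[pseudonym] = real_name
        return pseudonym

    def unmask(self, pseudonym: str, court_order: bool) -> str:
        """Reveal the escrowed identity only under legal process."""
        if not court_order:
            raise PermissionError("identity is released only under legal process")
        return self._escrow[pseudonym]
```

The design choice is that anonymity is preserved toward the public and the government by default, while accountability survives because a legal process can still reach the escrowed record.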
Moderation implies that a representative of the social service reviews content before it is posted to the public. The size of this task at some companies and the willingness of companies to take on this responsibility are legitimate barriers. Moderation at scale would have a predictable failure rate, so it would require funding beyond the labor involved to cover legal fees and payments in cases where a judgment goes against the company. I see two options. First, the cost of moderation could be paid by users, through a prepayment that is drawn down to compensate reviewers. Second, companies could make a reasonable financial allotment for moderation, and content requiring review would be addressed at whatever pace those moderators could manage.
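The first option, prepaid moderation, could be modeled as a simple review queue. This is a toy sketch under assumed parameters (the `REVIEW_FEE` value and all class and method names are invented for illustration): each submission draws a review fee from the poster's prepaid balance and waits until a moderator approves or rejects it.

```python
from collections import deque

REVIEW_FEE = 0.05  # assumed per-post review cost, drawn from the poster's balance

class ModerationQueue:
    """Toy model: prepaid balances fund review; posts wait until reviewed."""

    def __init__(self):
        self.balances = {}       # user -> prepaid funds
        self.pending = deque()   # submissions awaiting a moderator
        self.published = []      # approved posts

    def deposit(self, user: str, amount: float) -> None:
        """Add prepaid funds that will pay for future reviews."""
        self.balances[user] = self.balances.get(user, 0.0) + amount

    def submit(self, user: str, text: str) -> None:
        """Charge the review fee and queue the post; reject if unfunded."""
        if self.balances.get(user, 0.0) < REVIEW_FEE:
            raise ValueError("insufficient prepaid balance for review")
        self.balances[user] -= REVIEW_FEE
        self.pending.append((user, text))

    def review_next(self, approve: bool) -> None:
        """A moderator processes the oldest pending post."""
        user, text = self.pending.popleft()
        if approve:
            self.published.append((user, text))
```

Note what the model makes visible: posts are delayed until a paid reviewer gets to them, which is exactly the speed-of-publication cost discussed below.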
Are these perfect solutions, and would they allow equitable access? No, and not completely. Speed of publication would be an issue. On the other hand, addressing the funding issue could itself create jobs and services: companies offering authentication services could be formed, and moderators could be hired.
Authentication as a service already exists. For example, any of us wanting to fly must take responsibility for authenticating ourselves. Participation in social media may appear trivial in comparison, but it seems likely some action will be mandated. To repeat something I keep saying: don't expect free.