“The Social Dilemma” is opening the eyes of Netflix bingers around the world and transforming digital lives. The filmmakers explore social media’s impact on society, raising important points about how user data is exploited in matters of mental health, politics and more. The film features interviews with industry executives and developers who discuss how social sites exploit the human psyche to keep users engaged and spending more time on the platform.
Despite the obvious problems with social media platforms, people still crave digital connection, especially during a pandemic, when in-person communication is difficult if not impossible.
So, how can the industry be improved? Here are three ways social media can create happier, healthier interactions and news consumption.
On most platforms, such as Facebook and Instagram, the company curates much of the information presented to users. This opens the platform to manipulation by bad actors and raises questions about who decides what information is visible and what is not, and what the motives behind those decisions are. Some platforms are uneasy with this role: Mark Zuckerberg said in 2019, “I just believe strongly that Facebook shouldn’t be the arbiter of truth of everything that people say online.”
Censorship can be taken out of the equation on a different kind of social platform. Consider, for example, a platform that does not rely on advertising dollars. If a social platform is free for basic users but monetized through a subscription or purchase model, there is no need for data-gathering algorithms to decide what news and content users are offered.
Such a platform is a less obvious target for manipulation because users only see information from people they know and trust, not from advertisers or random third parties. Manipulation on the major social channels often occurs when people create zombie accounts that flood content with fake “likes” and “views” to influence what others see. This has been exposed as a tactic of election interference, where agents use social media to amplify false claims. Abuse of this kind exploits a fundamental flaw in social algorithms that use AI to decide what content to promote and when to censor.
Don’t treat consumers like products
The issues raised by “The Social Dilemma” should reinforce the need for social platforms to self-regulate and ethically manage their content and user dynamics. They should examine the manipulative technologies that have contributed to loneliness, depression and other harms, and instead look for ways to promote community, growth and other positive qualities.
One of the major changes needed to bring this about is to eliminate or reduce advertising on the platform. An ad-free model means the platform has no need to aggressively push content from untrusted sources. When ads are the primary revenue driver, the social company has an interest in using every psychological and algorithmic trick to keep the user on the platform. It’s a game that works against consumers.
More time spent on the site means more ad impressions and engagement, and that means revenue. An ad-free model frees a platform from trying to provoke emotional reactions based on a user’s past actions just to keep them glued to the site, perhaps to the point of addiction.
Encourage connections without clickbait
A common form of clickbait appears on social discovery and search pages. The user clicks an image or preview video that suggests one type of content but is redirected to something unrelated. This technique can be used to spread misinformation, which is especially dangerous for viewers who rely on social platforms rather than traditional outlets for their news. According to the Pew Research Center, 55% of adults get their news “often” or “sometimes” from social media. That is a major problem when it is so easy to dress up distorted “fake news” stories as legitimate articles.
Unfortunately, when users engage with clickbait content, they effectively “vote” for that information. That seemingly innocuous act creates a financial incentive for others to create and spread more clickbait. Social media platforms should aggressively ban or restrict it. When it comes to blocking clickbait, management at Facebook and other firms often invoke “free speech.” They should consider, however, that the intention is not to act as censors blocking controversial topics, but to protect consumers from deceptive content. It’s about promoting trust and information exchange, which is far easier to achieve when post content is supported by facts.
“The Social Dilemma” is rightly regarded as an important film that encourages a vital dialogue about the role of social media platforms in everyday life. The industry needs to change, creating more honest and engaging places for people to connect without falling victim to exploitation of the human psyche.
It’s a tall order, but one that benefits both users and platforms in the long run: people still form important digital connections, and the platform acts as a catalyst for positive media and positive discussion. Now is the time for the major platforms to take note of, and take responsibility for, these necessary changes, and for smaller, emerging platforms to seize the opportunity by taking a different, less manipulative approach.