Leadership | Tools

Online Wellness: two tools for nonprofit professionals
The web can be a scary place. Here are two tools to help you protect your focus from two of the worst elements of online life: toxic comments and online ads.


Online Ads:

Advertising has fundamentally changed. Today's ads grab your attention without letting you know they are trying to sell you something. The average web page carries 4-10 ads: banners, popups, videos, and more. Digital marketing experts estimate that most Americans are exposed to around 4,000 to 10,000 ads each day.

Introducing uBlock Origin. uBlock Origin is available for most browsers. As a nonprofit executive, you are already mentally taxed by the social challenges in our communities; you do not need to also deal with malicious, subconscious, and intrusive advertising. uBlock Origin is an efficient tool that will make your online experience more enjoyable. The screenshot below shows a Forbes article where 37% of the page has been blocked as either ads or tracking scripts.

[Screenshot: a Forbes article with 37% of the page blocked by uBlock Origin]
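Under the hood, uBlock Origin decides what to block using community-maintained filter lists written in Adblock-style syntax. A few illustrative rules (the domains and class names here are placeholders, not entries from a real list):

```
! Comment lines begin with "!".
! Block every network request to a (hypothetical) ad server:
||ads.example.com^
! Hide any page element with the CSS class "ad-banner" on example.com:
example.com##.ad-banner
```

You rarely need to write rules yourself; uBlock Origin ships with sensible default lists, and its toolbar button shows how much of each page was blocked.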

Toxic Comments:

Tune is a Chrome extension that helps you control the volume of the conversation you see online. It lets you customize how much toxicity you want to see in comments across a number of popular platforms, including YouTube, Facebook, Twitter, Reddit, and Disqus.

[Screenshot: the Tune extension in action]

Tune lets you turn the volume of toxic comments down to "zen mode" to skip comments completely, or turn it up to see everything, even the mean stuff. Or you can set the volume somewhere in between to customize the level of toxicity (attacks, insults, profanity, and so on) you're willing to see in comments. Tune builds on the same machine learning models that power Perspective (https://www.perspectiveapi.com).

The machine learning powering Tune is experimental. It still misses some toxic comments and incorrectly hides some non-toxic ones. Tune isn't meant to be a solution for direct targets of harassment (for whom seeing direct threats can be vital to their safety), nor is it a solution for all toxicity. Rather, it's an experiment to show how machine learning can create new ways to empower people as they read discussions online.
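The "volume" idea above boils down to filtering comments by a toxicity score between 0 and 1, which is the kind of score Perspective reports. A minimal sketch in Python, with made-up comments and scores (a real integration would send each comment to the Perspective API for scoring):

```python
# Each comment paired with a toxicity score in [0, 1], in the style of
# Perspective's output. The comments and scores here are invented.
COMMENTS = [
    ("Great point, thanks for sharing!", 0.03),
    ("You are an idiot and so is everyone here.", 0.92),
    ("I disagree, but I see where you're coming from.", 0.12),
]

def visible_comments(comments, volume):
    """Tune-style dial: show only comments at or below the volume.

    volume=0.0 is "zen mode" (hide anything with measurable toxicity);
    volume=1.0 shows everything, even the mean stuff.
    """
    return [text for text, score in comments if score <= volume]

if __name__ == "__main__":
    # A mid-range setting hides the insult but keeps polite disagreement.
    print(visible_comments(COMMENTS, volume=0.5))
```

The hard part, of course, is producing the scores, which is exactly what the Perspective models do; the dial itself is just a threshold.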