Josh Waring

ITS 360 (sec 001)

Assignment #9 (Due 4/17/15)

What Would You Do?

1)      I would explain to my manager that I do not feel comfortable using my personal social media accounts to promote job-related topics.  Promoting the restaurant would feel deceptive, not only because I personally don't think it is all that great, but also because endorsing it without disclosing that I am an employee would border on illegal.

2)      I would tell the interviewer that they are free to check my social media pages and profiles, as I make sure not to post anything derogatory or inappropriate.  I don't participate in any illegal activities, so they certainly would not find evidence of that on any of my social media platforms.  Lastly, I would offer them access to view any social media site of mine that they wanted to.

Critical Thinking Q’s

1.1)  I believe that most social media sites are not doing enough to shut down communication and promotion by terrorist groups.  Sites such as Facebook and Twitter need to employ larger content-policing departments, and they should have clear, concise guidelines for identifying and dealing with potential terrorist content and communications.  However, these platforms would have to be careful not to violate the privacy of their users, who expect both privacy and genuine efforts to reduce terrorist use of the platforms.

1.2) I believe that US anti-terrorism laws should take precedence over the Communications Decency Act (CDA).  The intent of the CDA was not to promote the use of social media by terrorist groups, but to give ISPs and social media companies some release from liability for what their users post.  That said, I believe terrorist content should not fall within the CDA's protections, and it should not be allowed in any form on social media.

1.3) The lawsuit brought on behalf of victims of the Pulse nightclub shooting was dismissed in the spring of 2018, as the judge ruled that nothing the terrorist viewed on the social media platforms caused him to commit his act of terrorism.  As such, the judge ruled that the platforms could not be held liable for the terrorist's use of their sites, despite the fact that the content was easily visible on them.  I partially disagree with the judge, because I feel that social media sites should be held responsible for policing and removing potential terrorist posts, and a failure to do so that can be linked to a terrorist attack should expose them to legal action.  That being said, there has to be a demonstrated link between the social media posts and the attack, not just speculation.


2.1)  I don't think YouTube should take a more active stance in censoring its users.  Not only could this create legal issues for YouTube, as I believe actively curating content could cost the company its liability protection under Title II of the DMCA, but it would also lead to needless censoring of any video that doesn't fit YouTube's agenda.  In the recent YouTube ad boycott, many entirely legitimate, family-friendly channels suffered.  I know one YouTuber who lost nearly all of his ad revenue for no apparent reason; he played family-friendly video games and never swore or discussed sensitive topics.  YouTube's customer support is also notoriously bad, so unless that is fixed, I think the company should focus on repairing its current censoring and demonetizing algorithms and policies rather than enforcing stricter guidelines.


2.2) I think Google could work on developing a better AI to detect inappropriate and questionable content.  If the AI detects potentially harmful content, it should automatically demonetize only what the algorithm rates as extremely offensive; any other potentially harmful content it flags should first be vetted by a YouTube employee before it is demonetized (see the sketch below).  I also think YouTube needs to provide a better appeals process for YouTubers who lose monetization for no valid reason.
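To make that two-tier triage concrete, here is a minimal sketch in Python.  The thresholds, the names, and the idea of a single 0-to-1 offensiveness score are all my own assumptions for illustration, not anything YouTube actually implements.

from enum import Enum

class Decision(Enum):
    MONETIZED = "monetized"        # no action taken
    HUMAN_REVIEW = "human_review"  # queued for a YouTube employee to vet
    DEMONETIZED = "demonetized"    # extreme content: demonetized automatically

# Hypothetical thresholds on a 0.0-1.0 offensiveness score.
AUTO_DEMONETIZE_THRESHOLD = 0.95  # only "extremely offensive" skips human review
REVIEW_THRESHOLD = 0.60           # possibly harmful: a person decides first

def triage(offensiveness_score: float) -> Decision:
    """Route a video based on the classifier's offensiveness score."""
    if offensiveness_score >= AUTO_DEMONETIZE_THRESHOLD:
        return Decision.DEMONETIZED
    if offensiveness_score >= REVIEW_THRESHOLD:
        return Decision.HUMAN_REVIEW
    return Decision.MONETIZED

# A borderline video goes to a person, not straight to demonetization.
print(triage(0.72))  # Decision.HUMAN_REVIEW
print(triage(0.98))  # Decision.DEMONETIZED
print(triage(0.10))  # Decision.MONETIZED

Under this policy, only the videos the classifier is most confident about are demonetized automatically; everything else that looks questionable waits for a human decision, which is exactly the vetting step described above.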

2.3) I think Google/YouTube should guarantee advertisers that, while no system is perfect, it will do its best to prevent advertisements from being placed on hateful or extremist content.  If such content goes undetected and is monetized anyway, YouTube should agree to repay the affected advertiser a significant amount in compensation.