I led my team in this challenge as the UX Interaction Designer.
For the theme of this project, we were given the challenge of solving a fear. This raised many questions, and it also surfaced paths we didn't want to go down, like those relating to mental health issues.
Since we only had 24 hours to complete this project, we had to work fast. Not really knowing how to define fear, and trying to think outside the box, we looked up the acronym F.E.A.R., which led us to the phrase False Evidence Appearing Real.
The lightbulb went off. Finally, after two hours of constant project pitches, we came up with a project that would address a fear and possibly make a difference.
Our statement was: How might we organize information to help people understand the bias of various news media?
Research was easily the best part of this project.
It was time... to crunch some research. It was 10 p.m., our research deadline was midnight, and all we could think was that things were about to get real. We set out the snacks, put on the YouTube series Crash Course, and learned everything we could about politics and government.
Once we had all the information we needed, we landed an interview with a friend who was into cybersecurity and politics. The goal of this interview was to see whether our idea would work with machine learning, and whether someone would actually want to use an app like ours if it were correct 99% of the time. With questions thrown left and right, the team did its best to learn as much about the topic as we could.
By around midnight the tide had turned. We planned to finish our interactions and layout by 8 a.m. To save time and stress, we jumped into Crazy 8's, but instead of the usual eight minutes per set of interfaces, we cut it down to two. With our Crazy 8's we wanted to keep the feel of a news app while cutting the clutter and any information that wasn't relevant to what we wanted to show up front.
The topics page pulls articles about specific subjects and categorizes them by the dominant leaning of each one's party: red identifies Conservative, while blue identifies Liberal.
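We never built the classifier during the hackathon, but the color coding could be sketched roughly like this. The `party_score` value and its range are assumptions for illustration; in the real concept it would come from the machine-learning model.

```python
# Hypothetical sketch of the topics-page color coding. A score of -1.0
# means fully Liberal and +1.0 means fully Conservative; the scores here
# are stand-ins for what a trained classifier would produce.

def label_article(party_score: float) -> str:
    """Map a bias score to the app's red/blue color coding."""
    return "red (Conservative)" if party_score > 0 else "blue (Liberal)"

articles = [
    ("California Wildfires coverage A", 0.6),
    ("California Wildfires coverage B", -0.4),
]
for title, score in articles:
    print(f"{title}: {label_article(score)}")
```

The binary red/blue split mirrors how the topics page presents articles, with no neutral middle shown.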
Once you select a topic or article from the topics page, you are given a list of articles from other news companies, again color-coded Conservative or Liberal, so you can see each side's perspective and why it leans the way it does. In this example I select "California Wildfires," which sends me to the articles page, where I can view the other discussions happening around the California wildfires. We kept the conversation boxes staggered because we wanted you to feel as though a conversation was unfolding, just like text messaging.
The purpose of the facts page is to surface the key information that comes up again and again across all of the articles. The point of this page is to let you form your own opinions rather than absorb someone else's bias.
Once you have selected the article of your choice, you are given a set of buzzwords relating to things that are brought up frequently within the topic.
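The write-up doesn't specify how buzzwords would be extracted, but a minimal sketch, assuming a simple word-frequency approach, might look like this. The stopword list and `top_n` cutoff are illustrative choices, not part of the actual design.

```python
from collections import Counter

# Hypothetical sketch of buzzword extraction: count word frequency across
# a topic's articles and keep the most common non-trivial words.

STOPWORDS = {"the", "a", "and", "of", "in", "to", "is", "are", "as"}

def buzzwords(texts: list[str], top_n: int = 3) -> list[str]:
    """Return the top_n most frequent non-stopword words across texts."""
    counts = Counter(
        word
        for text in texts
        for word in text.lower().split()
        if word not in STOPWORDS
    )
    return [word for word, _ in counts.most_common(top_n)]

sample = [
    "evacuation orders issued as wildfires spread",
    "wildfires force new evacuation orders",
]
print(buzzwords(sample))  # e.g. ['evacuation', 'orders', 'wildfires']
```

A production version would likely use a real keyword-extraction model, but frequency counting captures the "things that come up a lot" idea described above.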
When you are in an article, you can toggle the bias switch on or off, which highlights the areas flagged as biased and helps you understand why the article reads the way it does. When you tap on a highlighted area, it gives you a description of why the machine learning model flagged that passage as biased.
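The toggle behavior could be sketched like this. The flagged phrases, their explanations, and the bracket markers are all assumptions; in the concept, the (phrase, explanation) pairs would come from the machine-learning model.

```python
# Hypothetical sketch of the bias toggle: the model's output is assumed
# to be (phrase, explanation) pairs. Toggling on wraps each flagged
# phrase in markers the UI could render as highlights; tapping a
# highlight would show the stored explanation.

def apply_bias_highlights(text: str, flags: list[tuple[str, str]],
                          enabled: bool = True) -> str:
    """Wrap flagged phrases in [brackets] when the toggle is on."""
    if not enabled:
        return text
    for phrase, _explanation in flags:
        text = text.replace(phrase, f"[{phrase}]")
    return text

article = "Officials once again failed to act as the fires spread."
flags = [("once again failed", "loaded phrasing implies a pattern of blame")]
print(apply_bias_highlights(article, flags))
```

Keeping the explanation alongside each flagged phrase is what lets the tap interaction answer "why was this marked as biased?" without a second model call.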
Overall this project was super fun, and in the end we won the competition. Long hours with no sleep rattled how we were processing things, but it forced us to organize the information we needed and not slip into clutter. The purpose of Candor is to categorize articles as either Conservative or Liberal, pull out the information that is constantly brought up across those articles, and collect it on the facts page. This lets users form their own opinions rather than become biased toward one viewpoint. And if you want to tap into an article, the ability to educate yourself on the information, and the bias being flagged within it, is right there.