World Tune

Combine photography + quirkiness to reinvent artistic creation.
For the curious, creatives, and explorers




Google X SCAD Pro 

Collaborative Project
Sept ‘22 - Nov ‘22


Topic of Work: UI and Interaction Design
Skills: UI Design, Visual Design, UX Research, Prototyping
Tools: Figma, Illustrator, Arduino

Overview: The team was tasked with building upon Pixel's cutting-edge computational photography features to design ingenious and delightful content creation experiences.


Challenge: The Google Pixel team approached SCAD to design a quirky, Pixel-exclusive app that’ll transform the way Gen Z experiences, records, and expresses life's moments.

Solution: The end deliverable was four proofs of concept presented to the Google Pixel team.

Highlights: I led both primary research and UI/interaction design for our application.




Process

Understand It

Understanding the problem/opportunity space before primary research
Google came to the team with the challenge of “designing a quirky, Pixel-exclusive app that’ll transform the way Gen Z experiences, records, and expresses life's moments”


Think It


Primary and secondary research + key insights learned

This experience gave me a great opportunity to lead the team through the UX research process: creating a survey and interview questions, affinitizing findings, and writing How Might We (HMW) statements. Together, we conducted research on the creative process of Gen Z creatives.




Insights:



Build It


Conceptualization and initial user testing
We adopted multiple ideation strategies for the utmost variety and impact, running Crazy 8s, co-creation workshops, and mind mapping to generate as many ideas as we could. My team was particularly drawn to the insights "we want our stories to leave a mark" and "there is no limitation to inspiration." From there, we began brainstorming our idea, World Tune:

- Our phones are now extensions of ourselves; we need to go beyond visuals
- World Tune offers a sensory representation of the user's experience by generating sounds that match their moment and inducing haptic feedback to amplify the emotion through AI technology

To validate our concept, we ran two initial tests with users to confirm whether our idea had potential:



Test 1
What: Simulating the AI sound generator by collaborating with a sound designer, who demoed a sound based on each user's music preferences, relating it to images they'd taken.
Impact: This influenced how our concept moved forward in all aspects, including editing controls, tune alternatives, and the makeup of the sounds.



Test 2
What: To test the use of haptics, we blindfolded users, had them touch and feel different textures, and asked them to assign each texture a genre of music such as pop, indie, or reggaeton.
Impact: We were able to categorize and understand the connections people make between texture and sound.

Initial Wireframes:

With the first round of UI, the team and I wanted a playful approach, implementing uncommon interactions to engage our users.

User Feedback
  • Love the playfulness
  • Structured grid would be easier to comprehend
  • Emphasize hierarchy



Tweak It


Round Two:
Our second round had a stronger brand identity and style. The team came together around our first round of feedback to make sure we had a more cohesive mockup to present.

User Feedback
  • Too much “organicness” when it comes to the gallery
  • Reduce button size 
  • Allow more customization to the buttons
  • No need for drop/inner shadows







For questions, thoughts, comments, or just general grievances :) email me at gargipantdesign@gmail.com