Google Drive Finder Plugin

I use the Google Drive macOS Finder plugin. To save local storage space, I don't sync all of my folders to my computer. Unfortunately, this means that when I'm looking for something I can't find, I have to look inside the preferences pane to see whether the file might be in an unsynced folder.

A better system would show the unsynced items within Finder, so I could explore the directories and only download what I need. Unsynced folders and files could be shown in grey, and a right-click could offer an option to sync them to my local device.

Preferences Pane

Unsynced folders are shown in the preferences pane. This is great functionality, but it's not very accessible (read: user-hostile).

In addition to being hidden away, this pane could use some additional functionality. It ought to show files as well -- again, going back to my use case of looking for a file I haven't synced.

Finder Window

Unsynced files / folders should appear in the Finder window.


Sync (download) remote files / folders

Here's a masterfully crafted doodle of a right-click menu interface that allows the download of offline (greyed-out) items.

Meetup Week Calendar

When looking for local meetups and events, you have to consider your own schedule. Work, school, business meetings -- life often gets in the way of your networking efforts.

To help users see which events fit their schedule, Meetup.com could deploy an improved calendar view. Week view calendars visualize time of day vertically, and day of the week horizontally. This is perfect for seeing which events fit within your available blocks of time.

Concept sketch

[Fig. 1] Sketch made with the Google Keep mobile app

From screenshots, I built a blockframe design (Fig. 2.3), as well as a blank slate (Fig. 2.4) to use as the canvas for the calendar interface.

[Fig. 2]

I adopted patterns from Google Calendar's week view, where the seven day columns are equal width and the far-left time column [hh:mm] is slightly narrower.

[Fig. 3]

The initial mockup contained a light color palette that was difficult to see, so I darkened the colors to add visual contrast. [Fig. 4.1]

I also removed the deselected groups [Fig. 4.2], since it's more useful to only show groups the user has selected for viewing.

[Fig. 4]

[Fig. 5]

Genius Chords Viewer

You're having a jam session with your friends. When you look up song chords, every search result leads to an ugly, non-mobile-friendly site.

This is a plea for Genius.com to help bring song chords into the 21st century. 

Given their strong community of contributors, it'd be reasonable to imagine that some users would be willing to contribute chords, so others can learn how to play the songs.

Here's a prototype of this idea, along with some related diagrams.

I tried to find a toggle on their website to conform to their current design, but I came up empty-handed. I decided to pick Google's Material Design toggle switch, as the Genius design somewhat resembles Material Design with its flat, block-oriented UI elements and bright colors. (The purple used in this post is similar to the pastel purple used on Genius.com.)

Rebuilding the Google Material Design toggle switch.

I would've liked to build an animated .gif of this animation, but this diagram will do for now. Basically, the chords (purple) ought to slide out from underneath the lyrics (grey), with an "ease in" behavior for both opacity and position.
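
For the curious, here's a minimal sketch of the intended motion (the frame count and pixel values are made up, just to show the ease-in math):

    # Hypothetical sketch of the slide-out: opacity and vertical offset both
    # follow a cubic ease-in curve. Frame count and pixel values are made up.
    FRAMES = 30          # assumed animation length in frames
    SLIDE_DISTANCE = 24  # assumed slide distance in pixels

    def ease_in(t):
        """Cubic ease-in: slow start, fast finish, for t in [0, 1]."""
        return t ** 3

    for frame in range(0, FRAMES + 1, 5):
        progress = ease_in(frame / FRAMES)
        opacity = progress                        # chords fade in from 0 to 1
        offset = SLIDE_DISTANCE * (1 - progress)  # and slide out from under the lyric
        print(f"frame {frame:2d}: opacity={opacity:.2f}, offset={offset:.1f}px")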

Animation overview.

While the animation helps users make sense of the relationship between the chords and lyrics, whitespace separates chord/lyric pairs from one another to reduce confusion.

Block spacing.

Checklist "Share" Interface

To make it easier to share music with friends, I recommend Spotify take some hints from Snapchat's checklist interface, which, in my opinion, is part of the reason Snapchat is so engaging: the sending mechanism is simple and powerful.

I'll walk through my UX design process workflow in this post, but if you'd like to skip to the completed designs, click here.

Overview of prototype workflow, combining and synthesizing existing elements to make new elements.

First I outlined screenshots of Snapchat. The checklist interface is incredibly simple, but I thought I'd build an outline of it just for fun. 

Formal Analysis of Snapchat checklist.

After analyzing the visual architecture of the Snapchat interface, I outlined Spotify's "send" window. This helps me preserve the padding between the edge of the screen and the window.

Spotify Mobile Share Window

To make these outlines, I use the rectangle tool, bright colors, and low opacity so I can see both the background and the overlay shape. Once the shapes are pixel-perfect, I add a high-contrast stroke to better visualize them.

Page navigation showing the user path to the share menu.

After revisiting the design, I made some minor changes to improve the visual continuity and overall prettiness of the checklist. I also added an extra "share to my feed" checkbox, merging the functionality of the "post to..." and "share to..." options into one single interface, which requires fewer taps by the user.

A few iterations of the checklist design. #2 is simplified and cleaner, #3 adds a scrollbar on the right edge.

Please let me know what you think of this design in the comment section below. I'd love to hear your feedback, and I'd be willing to make changes and credit them to you on the blog. Iterative design is about constant improvement, and I'd like to incorporate that into this blog as well.

Feel free to connect with me on Twitter, LinkedIn, and AngelList.

Spotify Notification Drawer Player

In an attempt to reclaim some wasted space, I redesigned some of the UI elements of the Spotify player in the Android notification drawer. The album art is larger, unnecessary text has been removed, and the playback buttons are closer together to make them easier to reach with one-handed operation. (Sorry, lefties -- maybe there could be a left-hand mode with the buttons on the left.)

Hopefully this design looks a bit more balanced. Let me know what you think in the comment box below!

Process

  • Quickly sketch idea in my journal while sitting at a stop light on my morning commute.
  • Take screenshot of the Spotify player. (again, while on my commute)
  • After work, take another screenshot so I don't have Google Maps navigating in the background.
  • Take photo of journal page.
  • Import photo into Affinity project.
  • Make crude sketch in Affinity. (Fig. 1.)
  • Build blockframe by placing transparent rectangles on the screenshot, then colorizing them. (Fig. 2.)
  • Copy the original blockframe & start prototyping the new design. (Above)

Fig. 1. Digital wireframe sketch with markups in red.

Fig. 2. Iterative blockframe design using transparent objects for placement.

Site Preview in Search Results

This is a user interface mockup for a search results page that previews fully interactive mobile pages, aiming to improve the speed at which users can access relevant information.

It's not perfect, and perhaps it addresses a need that's not there. But with a few tweaks, this could be A/B tested to validate its usefulness. Just an idea.

Here's a blockframe comparison between a typical search results page and the proposed interface. This is useful for weighing the pros and cons of each: for example, the left can show more results in a single viewport than the right, but the right lets people look over each site without having to visit it individually.

Initial concept sketched in my journal.

Photo of my Affinity Artboard, which somewhat illustrates my workflow, as I work from top to bottom.


Workspace

A vector illustration of my typical workspace: coffee, MacBook Pro, Wacom tablet, and my 7th Moleskine grid journal.

For the curious, here's a blown-up view of the journal sketches. I drew up a montage of diagrams from recent pages of my current journal. 

For the really curious, I'll explain what a few of the diagrams mean:

[1] Midway down the page on the far left is a camera pointed at what appears to be a river. This is a concept for using surveillance cameras and computer vision technologies to analyze environmental changes, such as river flow rate, flood patterns, changes in foliage, weather patterns, etc. I hope to write a blog post about that soon.

[2] At the bottom of the left page are a few hierarchy diagrams. These are loosely based on an idea I have to map out power structures: political power, economic power, etc. The left diagram illustrates a highly centralized system, with the majority of the power held by a single entity. On the right, there's loose central oversight of highly distributed powerful entities. The inequality operator is there to show my personal preference for decentralization.

[3] Middle-right is a nested diagram based on an idea to show local governance and representatives based on one's location. A visual guide of who's in power could help increase civic participation, as we'd know exactly whose door to go knocking on when things aren't done right.

[4] Bottom-right is a diagram of my optimal workspace location. Basically a desk right next to a window overlooking a dense city block. The vibrant movements of cities offer creative inspiration to me, and if I could just look down and see the city, that would be incredible. Gotta hustle to get there, I suppose.

The scribbles in the middle and on the right edge are what I do to get my ballpoint pens working. I usually do this whether or not my pen's been acting up, since I'd rather have scribbles than a gross-looking attempt to write text or draw something.

UX Projects:

User interface / user experience design is usually what I'm working on with the above workspace setup. Here are some of those projects!

Improved Rating System

This is a concept for a rating system centered around a collapsible hierarchical list, so users can rate sub-items within categories. This lets users express nuanced opinions without writing long reviews, which saves time for everyone: less time stressing over a beautifully written, typo-free review, and fewer internal man-hours parsing poorly written ones.

Restaurant Rating Example

Comparison between a written review and a multi-level rating system.

Not only is the image on the right easier for humans to read (given a few UI tweaks and different visualizations), but it's also easier for computers to understand. This makes data analysis much easier, since there's no need to extract nuance from written reviews.
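
To illustrate the machine-readability point, here's a rough sketch of what a multi-level rating could look like as structured data (the categories and scores are hypothetical):

    # Hypothetical restaurant rating as a nested structure: each category
    # holds sub-item ratings on a 1-5 scale, with no free text required.
    review = {
        "Food": {"Taste": 5, "Presentation": 4, "Portion size": 3},
        "Service": {"Speed": 2, "Friendliness": 5},
        "Ambiance": {"Noise level": 3, "Cleanliness": 4},
    }

    # Rolling up sub-item scores is trivial compared to parsing prose.
    for category, items in review.items():
        average = sum(items.values()) / len(items)
        print(f"{category}: {average:.1f} ({len(items)} sub-ratings)")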

This concept could be deployed effectively for nearly all rating systems. App Store app ratings, Google Maps location ratings, Amazon product ratings, LinkedIn job ratings, etc. 

Suggested applications from left to right: Google Play Store, Apple App Store, Google Maps, Apple Maps, Amazon, Netflix.


I spent some time looking for the best user interface for nested lists and hierarchies. I used the macOS style for my mockup because it's simple and clean, but if you know of any prettier interfaces, please drop me a line in the comments below! 

View this article on Medium.

Google Project Sunroof

Google's Project Sunroof is a tool that estimates the solar potential of buildings by analyzing 3D models from Google Maps, along with other factors. I've compiled some feature requests that could improve the service.

On this page:
- Dark Background

Dark Background

With the current color scheme, the highest value data point is also the least visible. Yellow intuitively signifies solar potential, but the light grey background makes yellow very difficult to see.

Here's a mockup of the data on a dark blue background, which makes it much easier to see. I built it with Google's own map style maker, using the following parameters: roads = 100%, landmarks = 0%, labels = 0%, color = "Aubergine".
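
For reference, the style maker exports a list of styling rules as JSON. Here's an approximate sketch of what that export looks like, written as the equivalent Python structure (the real Aubergine preset defines many more rules, and the exact colors here are my assumption):

    # Approximate sketch of an exported map style; not the full preset.
    map_style = [
        {"elementType": "geometry", "stylers": [{"color": "#1d2c4d"}]},  # dark base
        {"elementType": "labels", "stylers": [{"visibility": "off"}]},   # labels = 0%
        {"featureType": "poi", "stylers": [{"visibility": "off"}]},      # landmarks = 0%
        {"featureType": "road", "elementType": "geometry",
         "stylers": [{"color": "#304a7d"}]},                             # roads visible
    ]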

If you're aware of any similar tools designed to estimate solar energy potential, please link to them in the comments below. I'll try to add them to AlternativeTo.net, my favorite site for finding alternatives to specific apps and services.

Glassdoor

Here are some feature requests and interface mockups for glassdoor.com, a site designed to assist job-seekers by making companies and salaries more transparent.

On this page:
- Search Histogram
- Wage Gap Analysis

Salaries > Search Results > Show Histogram

Replacing the horizontal bar with a histogram serves both an aesthetic and a pragmatic purpose. Histograms are prettier, universally understood, and allow the user to recognize patterns (e.g. entry-level salaries in your city are significantly lower than the national average for entry-level positions, while the overall mean salary is comparable to the national average).
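
Here's a minimal sketch of how such a histogram could be rendered, using randomly generated stand-in salaries (purely illustrative):

    # Sketch: distribution of submitted salaries as a histogram, with the
    # mean marked. Salaries here are randomly generated for illustration.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    salaries = rng.lognormal(mean=11.0, sigma=0.25, size=500)  # synthetic sample

    plt.hist(salaries, bins=30, color="#4caf50", edgecolor="white")
    plt.axvline(salaries.mean(), color="black", linestyle="--",
                label=f"mean = ${salaries.mean():,.0f}")
    plt.xlabel("Reported salary ($)")
    plt.ylabel("Number of submissions")
    plt.legend()
    plt.show()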

User interface mockup: 1) show distribution histogram of submitted salaries, 2) reveal salary with cursor hover.

Companies & Reviews > Company Info > Wage Gap

Transparency is good. Sometimes transparency seems bad, but that's only when it reveals the bad stuff hiding beneath the surface. To read about the benefits and dangers of increased wage transparency, check out this article about Buffer's transparency initiative.

This feature, if properly executed, would illustrate the wage gap between the lowest-paid workers and the highest-paid executives. Submitted salaries (green) would be compared to the estimated pay of the highest-paid executives, based on variables including (1) the company's yearly earnings (this field could be crowdsourced by users who want to do some digging online) and (2) the CEO-worker pay gap in the respective country, state, or other district where this data is readily available.

Spotify

Show Album Artwork

Feature request for Spotify to show album artwork in the list view, to improve usability. This would let users quickly scroll through a list and find specific songs by the color of the album art, instead of trying to speed-read song titles.

Related links: Twitter thread with links to the Spotify Community request, where pretty much all of the comments are bashing Spotify for not having this feature.

Squarespace

Feature requests for Squarespace, my current website builder.

On this page:
- Suggest Blog Thumbnails
- Number of Available Pages
- Scrolling: Summary Carousel Block
- Lightbox Filmstrip

Blog Editor

Populate optional thumbnails from image blocks in the "Content" pane.


Show Available Pages

To help users see how close they are to their subscription's page limit, there should be a small quota tally at the bottom of the page structure editor.


Two Finger Scroll

In the carousel view of the summary block, two left/right arrows in the top-right corner (1) are used to navigate to the next/previous pages. To improve usability, the carousel block should respond to intuitive horizontal scroll events, such as 1) trackpad two-finger scrolls and 2) single-finger touch-screen swipes.

Basically, make the carousel block actually behave like a carousel. 

Lightbox Filmstrip

Anchor Links

On the subject of anchor links (the links that scroll to different positions on the page), I have two suggestions.

1) I'd like to see Squarespace adopt an automatic anchor link system for all header text, similar to what's used on GitHub. When the user hovers their cursor over a header, a "link" icon could appear. (A rough sketch of the slug logic follows after this list.)

2) my second suggestion 
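
Here's that rough sketch for suggestion 1: an approximation of how header text could map to anchor slugs (GitHub's exact algorithm differs in some edge cases):

    # Approximate GitHub-style anchor slugs: lowercase, strip punctuation,
    # spaces become hyphens. GitHub's exact rules differ in edge cases.
    import re

    def header_anchor(text):
        slug = text.strip().lower()
        slug = re.sub(r"[^\w\s-]", "", slug)   # drop punctuation
        slug = re.sub(r"\s+", "-", slug)       # spaces -> hyphens
        return "#" + slug

    print(header_anchor("Two Finger Scroll"))  # -> #two-finger-scroll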

Affinity Designer

Feature requests for Affinity Designer, my vector illustrator of choice at the time of this writing.

On this page:
- Open Recent
- Quick Preview

Also, here's a Twitter search of tweets from @jamiefeedback to @MacAffinity.

Menu > File > Open Recent > [ ... ]

The following are requests for the "Open Recent" section under the "File" menu, to help improve the user's ability to find recent projects. 

File > Open Recent > Show Image Preview

File > Open Recent > Search & Scroll

Quick Preview

In macOS, you can preview a file by selecting it and hitting the spacebar. Unfortunately, previews of Affinity files don't fill the screen space that other image file types such as .jpg, .png, and .psd do. Here is a comparison of the preview screen area.

.affinity file Preview

.jpg, .png, .psd file preview

Google Maps

Feature requests for Google Maps, to help us improve our understanding of our planet.

On this page:
- Typical Traffic Graphs
- 3D on Mobile
- Snapshot of ideas


Typical Traffic Graphs

For those of us trying to avoid traffic, a "congestion timeline" would be very valuable. Google already has a "Typical traffic" slider to illustrate which roads are usually the worst, but it doesn't show a timeline. Here's a user interface prototype I've made to illustrate this idea.

Diagram 1: Bar graphs illustrate traffic data per weekday, and per hour given the selected day.

As the user scrolls and zooms to view different areas, the graphs should automatically update, with smooth ease-in Bézier animations.
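
For the sake of illustration, here's a minimal sketch of how the per-hour bars from Diagram 1 could be drawn, using a made-up double-peak congestion pattern:

    # Sketch of Diagram 1's hourly congestion bars for one selected weekday,
    # using synthetic congestion scores for each hour of the day.
    import numpy as np
    import matplotlib.pyplot as plt

    hours = np.arange(24)
    # Made-up double-peak pattern: morning and evening rush hours.
    congestion = np.exp(-((hours - 8) ** 2) / 8) + np.exp(-((hours - 17) ** 2) / 8)

    plt.bar(hours, congestion, color="#d9534f")
    plt.xlabel("Hour of day")
    plt.ylabel("Typical congestion")
    plt.title("Typical traffic -- selected weekday")
    plt.show()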

EDIT: As of June 2017, Google now provides graphs indicating how long a specific trip might take depending on the time of departure (e.g. longer during rush hour).

I've annotated a screenshot of this new feature below, indicating the need for more granularity, as well as the ability to more fully explore the dataset (all 24 hours of the day, instead of just the nearest 3 hours).


Support for 3D imagery on Mobile


Snapshot of ideas

I look forward to publishing more feature requests for Google Maps and mapping technologies in general. Here's a collapsed screenshot of my feature requests ( [fr] ) for Google Maps. 

Twitter

Some feature requests I've come up with for Twitter over the years.

Pinch to zoom out

A feature request for all platforms, both mobile and desktop, to improve the user experience of looking through tweeted photos.


I will continue to populate this page with diagrams of my feature requests for Twitter, but until I do so, here's a collapsed list of the various ideas I have for them. (some of these might not make sense on their own, but they'll make sense when I publish them)

Screenshot of my Workflowy note of Twitter feature requests (that's what [fr] stands for; the tag helps me quickly search my notes for feature requests).

Orange Data Mining

Feature Requests

A compilation of feature requests for Orange Data Mining, a popular open-source data analysis & visualization tool.

Live Preview: When adding a new widget, show a live preview of data when you hover your cursor over different widget types, to help the user quickly look through all the possible visualizations.

Custom gradient color picker for modules that use color gradients to render data visualizations. The custom color picker could be nested inside the pulldown list of preset gradients.

View this feature request as an open issue on GitHub

Usability request: scroll actions should adjust the view pane of the Hierarchical Clustering module. 

Song Deconstruction

When you listen to music, you can usually recognize individual instruments. The well-trained ears of musicians can identify complex aspects of musical composition: time signature, key, acoustic effects, etc. 

Music is just vibrations. Squiggles on a line graph of [air pressure] over [time]. All known characteristics of music are borne of the human mind's aptitude for recognizing patterns.

In the same way that computer vision is positioned to soon surpass human vision, we can develop "computer hearing" software to allow us to understand music in new ways. Specifically, an interface to visualize the instrumental building blocks of songs, to reveal the science behind the magic of music. 

Instrument Recognition

First, our analysis model must be able to identify instruments. Artificial neural networks could be trained to 'guess' which instrument made a sound, based on their training data.

Graphical representation of an instrument-recognizing probabilistic function. < % instrument probability > ex: Piano, Guitar, etc. 

One way we could train these networks would be to let the program identify patterns on its own, isolate the sounds, and ask us humans to classify the sound with text. This could be a crowdsourced effort, similar to how Google trains some of its neural networks with its "Crowdsource" app. 

instrument_name = input() asks the user for an instrument label, so the network can learn the names of the different sounds.
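
As a sketch of the recognizer itself -- assuming audio has already been preprocessed into fixed-size mel-spectrograms, and assuming a set of ten instrument labels -- a small TensorFlow classifier might look like this:

    # Minimal sketch of an instrument classifier. Assumptions: 128x128
    # mel-spectrogram inputs and 10 instrument classes. Not a production model.
    import tensorflow as tf

    NUM_INSTRUMENTS = 10  # e.g. piano, guitar, drums... (assumed label set)

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(128, 128, 1)),         # spectrogram "image"
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(NUM_INSTRUMENTS, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # model.predict(spectrogram) would then yield the <% instrument
    # probability> vector described above.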

Song Reconstruction

After we have a program that can recognize instruments, the next engineering hurdle will be improving the program's ability to recreate the song: a robotic musician that passes a musical "Turing test" -- one that makes music that sounds like a human created it.

Some may decry this as a dystopian "death of creativity", ushering in a future where musicians are put out of business by robots. This doesn't have to be the case. 

The purpose of the reconstruction process is to improve the ability to interpret and replicate nuance. Once that nuance is quantified, it can be visualized for us humans to make even more complex and creative musical works. 

Instrument detection and reconstruction will output musician-friendly MIDI tracks that can be imported into professional music production software.
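
For example, once a note has been detected, writing it out as MIDI is straightforward with a library like mido. A minimal sketch (the note, velocity, and timing values are placeholders):

    # Sketch: emit one detected note (middle C, roughly one beat) as a MIDI
    # file using the mido library. All values here are placeholders.
    import mido

    mid = mido.MidiFile()
    track = mido.MidiTrack()
    mid.tracks.append(track)

    track.append(mido.Message("program_change", program=0, time=0))  # piano
    track.append(mido.Message("note_on", note=60, velocity=64, time=0))
    track.append(mido.Message("note_off", note=60, velocity=64, time=480))

    mid.save("reconstructed.mid")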

It would probably be practical to deploy different neural networks to detect different types of patterns, since the scopes range from high-level rhythms and melodies to low-level patterns such as specific sounds and effects.

Different networks could be deployed to identify different pattern-levels. Screenshot taken from Workflowy, graphics made in Affinity Designer.

Let's build this. Feel free to promote your GitHub project in the comment section below.

Suggested resources: 

  • TensorFlow, a Python library for building and training neural networks.
  • TensorBoard for visualizing data from TensorFlow.
  • If you find any related open-source projects online, share them with me and I'll put them in this list.

Check out the video explanation of this idea, the outline on Workflowy, and feel free to join the discussion on Reddit and Twitter.

 

Crowdsourced Water Tests

Water is vital to life. The crisis in Flint, MI has caused many people to question the quality of their drinking water, as well as the government oversight of such public utilities.

The government does conduct studies to monitor drinking water quality, but nobody would disagree that we'd be better off if average citizens could double-check those results.

Just a map.

In short, the end-all solution to this problem is quite simply a map: crowdsourced, powered by the people and any other groups that want to contribute their data. Get advanced water testing devices into the hands of millions of communities by reducing costs and improving ease-of-use, and let those communities contribute to the map.

The data produced by amateur test kits certainly won't be of the highest quality, but as the number of data points increases, the aggregate becomes more reliable -- a relationship described by the Law of Large Numbers.
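
Here's a quick simulation of that effect, with a made-up "true" contaminant level and noise magnitude:

    # Law of Large Numbers sketch: noisy amateur readings of a true
    # contaminant level average out as the sample count grows.
    import numpy as np

    rng = np.random.default_rng(42)
    TRUE_LEVEL = 12.0  # made-up "true" contaminant level (ppb)

    for n in [10, 100, 1000, 10000]:
        readings = TRUE_LEVEL + rng.normal(0, 5.0, size=n)  # noisy tests
        print(f"n={n:5d}  estimate={readings.mean():6.2f}  "
              f"error={abs(readings.mean() - TRUE_LEVEL):.2f}")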

Gamify Public Health

An effective way to get people to contribute to this study is to offer them rewards. No, you don't need a ton of money to hand out; give them psychological rewards -- the feeling that they're helping their community. Similar to how Google Maps gamifies map contributions, this app could show users graphs of their contributions over time and stats on how many contaminants they've detected, making them feel like the local hero for discovering hazards and making the community a safer place.

The map is not only the final resting place for the test data; it can also help communities see which areas have out-of-date data, or no data at all, so they can work to fill in the gaps.

Visualization

It's important that this data is actionable; something you can take to your town hall or administrative body to say "here's where the problem is, please fix it." 

Simple data visualization can help amateurs identify the likely location of a contaminant. These visualizations can be created by connecting the dots and filling in the gaps, a mathematical process called interpolation. 

These visualizations are used to identify the probable location of a contaminant, as well as areas that need testing (2a: "missing data")

To further improve accuracy, the data (1b) can be "wrapped" to fit publicly available schematics (1a) of water utility lines, drainage sewers, and hydrological flood studies. This would place the data points within a logical context, answering questions like "what's upstream?" and "what's downstream?"

One method of achieving this visualization would be "surface fitting" (3a), a type of interpolation for 3D surfaces.
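
As a sketch of the interpolation step, SciPy can fit scattered test readings onto a regular grid (the sample locations and values are randomly generated for illustration; cubic interpolation produces the fitted surface):

    # Sketch: interpolate scattered water-test readings onto a regular grid.
    # Sample locations and values are randomly generated for illustration.
    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(1)
    points = rng.uniform(0, 10, size=(50, 2))          # test-site coordinates
    values = np.sin(points[:, 0]) + points[:, 1] / 10  # stand-in contaminant levels

    grid_x, grid_y = np.mgrid[0:10:100j, 0:10:100j]
    surface = griddata(points, values, (grid_x, grid_y), method="cubic")
    # NaNs in `surface` mark areas outside the data -- the "missing data"
    # regions (2a) that still need testing.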

Now what?

Let's build it.

If you've got the skills and you'd like to take a stab at this, feel free to post your GitHub project link in the comments at the bottom of this page. (Click here if you don't see the comment box.) With your permission, I'll promote your project in the body of the article.

Check out the video explanation of this idea, the outline on Workflowy, and feel free to join the discussion on Reddit (#1, #2) and Twitter.

3D Clouds

Imagine a weather app that lets you see detailed 3D models of the massive cloud structures above: the elegant wisps of cirrus, the bulky storm clouds as they roll across the sky… We have the technology to do this, we just haven't done it yet.

Perspective sketch of a cumulonimbus storm cloud above a city.

The concept is simple: a network of cameras pointed at the sky, spread far apart to capture different perspectives, with software combining all the images into a live, interactive 3D model.

Illustration of ground-based camera array using geometry to calculate three-dimensional cloud positions.
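
The underlying geometry is classic two-camera triangulation. Here's a minimal sketch, assuming both cameras lie in the same vertical plane as the cloud feature they're observing:

    # Sketch: estimate cloud height from two cameras a known distance apart,
    # each measuring the elevation angle to the same cloud feature.
    import math

    def cloud_height(baseline_m, angle_a_deg, angle_b_deg):
        """Cameras at either end of `baseline_m`, cloud between them."""
        cot_a = 1 / math.tan(math.radians(angle_a_deg))
        cot_b = 1 / math.tan(math.radians(angle_b_deg))
        return baseline_m / (cot_a + cot_b)

    # Two cameras 2 km apart, elevation angles of 40 and 55 degrees:
    print(f"{cloud_height(2000, 40, 55):.0f} m")  # estimated feature height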

Deployment

In terms of hardware, “Whole Sky Cameras” (wiki) could be used to capture hemispheric images of everything above the horizon. A live stream would then be set up (similar to WU’s weather webcam system), and location data would be added to the webcam.

Initially, a company wanting to pioneer this idea could blanket one specific area with webcam coverage, as a proof of concept to showcase the 3D models. As the network grows, the site would automatically merge other people’s webcams into the global crowdsourced model.

Final Thoughts

Awe-inspiring sculptures of colorful sunsets, meandering tornadoes, and billowing volcanic eruptions — that’s what we can capture with this technology.

Not only would this satisfy the curious mind; the tech could also be valuable for research, atmospheric studies (meteorological prediction improvements, fluid dynamics), flight simulation models, etc.

If you'd like to contribute to this idea, feel free to share your GitHub project in the comment section below (click here if you don't see the comment box), and with your permission I'll embed it in this article.

Check out the video explanation of this idea, the outline in Workflowy, and join the discussion on Reddit and Twitter.

Mockup of 3D cloud imagery over Manhattan Island.

View this article on Medium.

Self-Aware Roads

Roads are constantly falling apart. That’s just what happens. Sometimes the roads in greatest need of repair are neglected for long periods of time, but we can change that. We can use our collective computing power to help us prioritize which roads need to be fixed first.

The idea is to build a mobile application that records vibrations while you drive, and anonymously tracks your location to help build a crowdsourced map of how bumpy the roads are.

Vibrations due to cracks, uneven pavement, potholes — all recorded by the device’s accelerometer.

Given three example vibration datasets, the algorithm (statistical regression, probably -- idk, I’m not a data scientist) can determine which vibrations are road-related and which are anomalies, such as someone fumbling their phone around.

Comparing datasets against one another increases the quality of the roadmap, automatically identifying user errors.
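
Here's a sketch of that cross-checking step: bin each drive's vibration intensity by position along the road, take the per-bin median as the consensus roughness, and flag readings that disagree (synthetic data; a real pipeline would align GPS traces first):

    # Sketch: compare three drives over the same road. The per-bin median is
    # taken as the road's true roughness; large deviations are anomalies
    # (e.g. a fumbled phone). Data here is randomly generated.
    import numpy as np

    rng = np.random.default_rng(7)
    road = rng.uniform(0.1, 1.0, size=100)              # true roughness per road bin
    drives = road + rng.normal(0, 0.05, size=(3, 100))  # three noisy recordings
    drives[1, 40:45] += 2.0                             # driver 2 fumbles their phone

    consensus = np.median(drives, axis=0)
    anomalies = np.abs(drives - consensus) > 0.5        # boolean anomaly mask
    print("anomalous bins per drive:", anomalies.sum(axis=1))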

Read about the Law of Large Numbers for a mathematical explanation of the relationship between number of data inputs & accuracy.

Check out the video explanation of this idea, the outline in Workflowy, and feel free to join the discussion on Reddit (link 1, link 2) and Twitter.