Cluttered search data

When users performed a search, the results were unorganized and looked more like a data dump than a refined snapshot. The new journal cards needed a visual redesign with a clean, logical, and dynamic information hierarchy.

  • Dated interface designs

  • Content looked like it was floating, with no clear organization

  • Styling was inconsistent between elements


Reorganize the visual data

The overarching goal was to reorganize the information to better match our internal data entry process, which felt more logical than what the customer was seeing. This meant redesigning the visual layout for descriptions, stats, review and submission details, links, contacts, and journal history.

We needed to improve scanning and usability by fitting data into one expandable, screen-sized card to avoid information overload. This also meant reducing clutter in expanded cards by incorporating some progressive disclosure (only the info you need at the right time).

Lastly, we wanted to incorporate better help text within the card itself to explain metrics in case there was any confusion about the data.

  • Update data hierarchy and modernize journal results

  • New design for CCI metric to incorporate all disciplines

  • New design for Altmetrics metric

  • Predatory Reports cards


Steps of the process

Evolution from old designs

Each journal entry includes lots of information to display, from contact info to submission requirements. All of it needed to be organized in a clear manner and easily accessible. The old design (seen below) used an expandable card format, which felt appropriate, but the data was hard to make sense of quickly because of its chaotic organization. Colors, metrics, and other elements felt arbitrarily chosen and unsettled, leading to information overload for users.

Redesigning the journal entries

I first listed out all the data types and grouped them into categories, ranked by customer importance. Using this new hierarchy, I applied gestalt principles of visual grouping to mock up a new card design. Overview data like titles and important metrics would be shown in the collapsed view for easy scanning. Contact info, metrics, and submission details would be separated into tabs within the expanded detail view.
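The collapsed-overview / tabbed-detail split described above can be sketched as a data structure. This is a hypothetical illustration, not the actual product schema; all field and tab names are assumptions:

```typescript
// Hypothetical sketch of the journal card's information hierarchy.
// Field and tab names are illustrative, not the real product schema.

interface JournalCardOverview {
  title: string;
  issn: string;
  // Key metrics surface in the collapsed view for quick scanning;
  // null means the metric has no data for this journal.
  keyMetrics: { name: string; value: number | null }[];
}

type DetailTab = "contacts" | "metrics" | "submission";

interface JournalCardDetail {
  contacts: { role: string; email: string }[];
  metrics: { name: string; value: number | null; helpText: string }[];
  submission: { requirement: string; detail: string }[];
}

interface JournalCard {
  overview: JournalCardOverview; // always visible
  detail: JournalCardDetail;     // revealed on expand (progressive disclosure)
  defaultTab: DetailTab;
}
```

Splitting the type along the collapse/expand boundary mirrors the progressive disclosure goal: the renderer only needs `overview` until the user expands the card.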

I also created a table view of the entries for even quicker scanning, but because of past issues with data scraping and piracy, we settled on the large card format alone. This was unfortunate, because I believe the table view is how most customers would prefer to use our product.

Data states & extras

For each data point, I created visual states for every possibility, including null data. This covered labels, data values, icons, colors, and tooltips.
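The null-state handling described above can be sketched as a small mapping function. This is a hypothetical illustration; the names and the "N/A" treatment are assumptions, not the product's actual code:

```typescript
// Hypothetical sketch: map a raw metric value to one of the visual
// states described above. Names and formatting are illustrative.

type DisplayState =
  | { kind: "value"; label: string; text: string }
  | { kind: "null"; label: string; text: "N/A" };

function toDisplayState(label: string, value: number | null | undefined): DisplayState {
  // Null and undefined both render as an explicit "N/A" state rather
  // than an empty gap, so users can tell missing data apart from zero.
  if (value === null || value === undefined) {
    return { kind: "null", label, text: "N/A" };
  }
  return { kind: "value", label, text: value.toLocaleString() };
}
```

The discriminated union forces the renderer to handle the null case explicitly, which is what makes "a state for every possibility" enforceable rather than aspirational.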

Our in-house metrics and industry jargon can be hard to understand, so I set out to design some helpful guides. I created modal popups for each of the metrics, explaining how they were calculated and how to use them. I kept the language very simple and to the point. Tooltips, icons, and other standard help strategies were included throughout the site as well.

Metric for measuring discipline relevancy

The most difficult element to work on was our in-house metric, the Cabells Classification Index (CCI): aggregated citation counts used to find the top journals within each academic discipline.

In the past, we had displayed the number alongside color-coded medals, radar charts, and a list of percentages, all at the same time, which made the data hard to decipher. The data was difficult to represent because of how varied its usage was: a journal could have one discipline or as many as 17, with no indication of whether a score was good or bad relative to another.

After lots of bad options and failed designs, I settled on a coxcomb-style chart, or aster plot: a variation on the pie chart that works well for readers who find dense numeric data overwhelming. Each slice spans an equal angle (unlike a pie chart), with its radius varying according to its value. The slices are arranged around the circle in descending order, forming a spiral that makes differences easy to compare. Users could then quickly scan to see which discipline was performing best. To supplement the visual cues, a list of the specific numbers could be displayed next to the chart, and topics could be shown within each discipline for more specificity.
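The aster-plot geometry is simple enough to sketch: equal angular widths, radii scaled to the largest score, slices sorted in descending order to form the spiral. This is a minimal illustration of the technique, not the product's actual charting code; `asterSlices` and its parameters are assumptions:

```typescript
// Hypothetical geometry sketch for the aster/coxcomb chart described
// above: every slice gets an equal angle, its radius scales with its
// score, and slices are sorted descending to form the spiral.

interface Slice {
  discipline: string;
  score: number;
  startAngle: number; // radians
  endAngle: number;   // radians
  radius: number;     // scaled to [0, maxRadius]
}

function asterSlices(
  scores: { discipline: string; score: number }[],
  maxRadius = 100
): Slice[] {
  const sorted = [...scores].sort((a, b) => b.score - a.score);
  const anglePer = (2 * Math.PI) / sorted.length; // equal angular width
  const top = sorted[0]?.score || 1;              // largest score sets the scale
  return sorted.map((s, i) => ({
    discipline: s.discipline,
    score: s.score,
    startAngle: i * anglePer,
    endAngle: (i + 1) * anglePer,
    radius: (s.score / top) * maxRadius,
  }));
}
```

Because angles are fixed and only radii vary, a journal with 2 disciplines and one with 17 both produce a legible chart, which was the core representation problem.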

This was my solution for a progressive display of complex information. It is not perfect, but it works well as a catch-all solution. As an extra benefit, I was able to transform the nautilus shape into a product icon to be used in marketing materials.

Color coding journal disciplines

To further help users quickly scan journals, I color coded each of the disciplines (and added icons, not shown here). This wasn't strictly necessary, but it added an extra layer of subconscious pattern recognition: users could start to associate disciplines with specific colors, making it faster to identify their choices while scanning. I applied this to the CCI, journal icons, and anywhere else it made sense.

Dynamic data

A static list had worked fine for us in the past. Our database was not originally set up to serve dynamic data because we relied partly on a single set of third-party data, delivered to us in batches throughout the year. Our engineering team had to build a new way to load journals quickly with every type of data shown accurately. I created placeholder cards, or skeletons, to display while journals were loading. I also had to design charts to compare stats over time.
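The skeleton-card flow can be sketched as: render fixed-size placeholders immediately, then swap in real cards once the batch resolves. This is a hypothetical illustration; `fetchJournals`, `render`, and the card shapes are illustrative stand-ins, not the actual implementation:

```typescript
// Hypothetical sketch of the loading flow: show skeleton cards right
// away, then replace them with real journal cards once data arrives.

type CardView = { kind: "skeleton" } | { kind: "journal"; title: string };

function placeholderCards(count: number): CardView[] {
  // Fixed-size skeletons keep the layout stable while data loads.
  return Array.from({ length: count }, () => ({ kind: "skeleton" as const }));
}

async function loadCards(
  fetchJournals: () => Promise<string[]>,
  render: (cards: CardView[]) => void
): Promise<void> {
  render(placeholderCards(10)); // immediate feedback before the fetch resolves
  const titles = await fetchJournals();
  render(titles.map((title) => ({ kind: "journal" as const, title })));
}
```

Rendering skeletons synchronously before awaiting the fetch is what keeps the page from appearing empty during slow batch loads.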

Finally, the journals needed to include space for future types of data to be added. Lots of planning went into the designs to make sure they grew in a way that made sense.

Predatory journal cards

In addition to the cards for Journalytics, the launch of our new product Predatory Reports meant repeating the same process for predatory journals. We collected different types of data for these than for our verified journals, so we needed a special variation of the journal card, one focused on the list of violations our team found during the investigation process.


  • Didn’t realize we would have so much incomplete data

  • A table view would have been useful, but it would undermine our business by making scraping and piracy easier

  • Grading system for measuring journal quality was too hard to implement

  • Infinite scroll annoyed people because of slow loading and bad search logic


The biggest challenge was balancing my ideal designs against the information architecture we had in place. At the time, we couldn't restructure the data, improve the data flows, or change the information too much from what customers and our executives were used to. The journal cards were the final result of many other moving parts that needed their own improvements before anything drastic could happen to search results.

Other challenges

  • Removed lots of features because of engineering difficulties (maps, historical timelines)

  • Dynamic data was hard to design for with both yearly data dumps as well as daily changes


I’m happy to say that the cards are still used today. We even used the card design as the foundation for the new Journalytics Medicine product being released in 2022.

The flexibility built into the designs paid off when we had to swap a few metrics out along the way. Our customers come to us for our data, and this was the most important design project to get right. The success of these designs has allowed us to grow as a team, improve our product, and provide even more help to the research community.

As one of our happy customers explained:

“The metrics! The explanations & the charts! The easy to use interface.”
