Say Metadata Five Times Fast: Dequency Update 2/25

Crack open your notebook and bust out your gelliest gel pens, because class is in session for this Friday’s installment of weekly Dequency updates. When we’re not getting meta about data, we’re posting on Twitter, dodging bots and scammers in our Discord DMs, and occasionally talking to candidates for our open roles.

For this week’s update, we wanted to give a quick peek into a couple of things that’ve been getting a lot of attention in our Discord and elsewhere: (1) metadata, and how it relates to (2) search functionality. To offer a bit more context, we’ll explain what metadata is, how it impacts search features, and exactly why this is crucial to the experience of using Dequency.

What is metadata (and what does it have to do with vibes)?

Metadata is data that describes other data. In the context of music, a song’s metadata includes basic information like song name, artist name, genre, tempo, etc. In searchable catalogs (i.e., any place where users can browse a collection of music), platforms will often expand that basic metadata framework to include unique identifiers like mood or vibe. These super-specific, descriptive pieces of data make the catalog more searchable, make music more discoverable, and ultimately are meant to enhance the user experience. All music platforms hosting searchable catalogs thus rely on an internal metadata system to categorize songs and appropriately surface them to users through search and recommendations.
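To make that concrete, here’s a minimal sketch of what a song’s metadata record might look like. The field names here are our own illustration, not Dequency’s actual schema: a handful of standard descriptors plus a free-form list of mood/vibe tags.

```python
from dataclasses import dataclass, field

# Hypothetical song metadata record (illustrative field names only):
# standard descriptors plus free-form mood/vibe tags.
@dataclass
class SongMetadata:
    title: str
    artist: str
    genre: str
    tempo_bpm: int
    tags: list = field(default_factory=list)  # e.g. "upbeat", "summer"

track = SongMetadata(
    title="Example Song",
    artist="Example Artist",
    genre="Electronic",
    tempo_bpm=120,
    tags=["upbeat", "summer"],
)
```

The standard fields (genre, tempo) keep the catalog consistent, while the open-ended `tags` list is where the vibes live.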

When the metadata gets too descriptive…

Why is metadata important?

Metadata is crucial for many reasons — for example, it can inform complex legal processes like royalty tracking and ownership verification, and can thus be a deciding factor in whether an artist gets paid for the use of their work. If you wanna get your money right, get your metadata right. More relevant to the user of a platform like Spotify or Dequency, however, is the impact metadata has on search features.

Because metadata is the foundation upon which search features are constructed, inaccurate or insufficient metadata can make it difficult for users to find the music they’re looking for.

This is particularly consequential for a music licensing platform like Dequency. As emphasized in our Lite Paper, the ability for users to efficiently discover relevant music is key; otherwise, we’re failing in our objective to provide a frictionless service for those seeking sync licenses. Imagine trying to find the perfect song to sync to your motion graphic NFT, only for your search for “upbeat” to return no results… a working search experience requires a robust, complete, and descriptive library of metadata.
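The “no results” scenario is easy to see in a toy example (our own sketch, not Dequency’s search engine): a tag-based search can only surface songs that were actually tagged.

```python
# Toy catalog: each song is a dict with a "tags" list (illustrative only).
catalog = [
    {"title": "Song A", "tags": ["upbeat", "summer"]},
    {"title": "Song B", "tags": []},  # no metadata entered at upload
]

def search_by_tag(catalog, query):
    """Return songs whose tags include the query term (case-insensitive)."""
    q = query.lower()
    return [song for song in catalog if q in (t.lower() for t in song["tags"])]

# A search for "upbeat" finds Song A but never Song B -- not because
# Song B isn't upbeat, but because nobody said so in its metadata.
```

However clever the search layer gets, it can’t match on descriptors that were never recorded.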

How does Dequency use metadata?

Any search and discovery features we build must satisfy the needs of both musicians and users of Dequency (visual creators, community members, web3 explorers, etc.). Musicians want control over the system’s categorization of their music, and users don’t want song identifiers (like tags) to be messy or overly subjective. This means striking a balance between standard identifiers that are not modifiable (genre(s), tempo, vocal information, etc.) and custom identifiers (mood, vibe, etc.) that artists can assign as they see fit. One might argue that “mood” is, in fact, a vibe… but I digress.

My search for “vibey” actually worked… sending a note to our eng team rn

In the system we’re building, when a musician/rights owner uploads a song, there will be standardized metadata fields, such as genre and sub-genre, where the artist will select from pre-existing, industry-standard options (Pop, Electronic, etc.). However, there will also be customizable fields where the artist can describe the vibe of their songs more personally: think tags like “Epic”, “Uplifting”, or “a str8 banger”. The goal here is to maintain consistency using common industry identifiers while also allowing musicians to categorize their music in the way they feel it is best represented.
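That split could look something like the sketch below (a hypothetical validation step of our own, not Dequency’s code): standardized fields are constrained to a fixed list, while custom tags are accepted as the artist writes them.

```python
# Hypothetical upload validation (illustrative only, not Dequency's code):
# standardized fields must come from a fixed industry list, while custom
# mood/vibe tags are free-form text chosen by the artist.
STANDARD_GENRES = {"Pop", "Electronic", "Hip-Hop", "Rock"}  # illustrative subset

def validate_upload(genre, custom_tags):
    """Reject uploads whose genre isn't a recognized option; pass tags through."""
    if genre not in STANDARD_GENRES:
        raise ValueError(f"Genre must be one of {sorted(STANDARD_GENRES)}")
    # Custom tags are accepted as-is: "Epic", "Uplifting", "a str8 banger"...
    return {"genre": genre, "tags": list(custom_tags)}
```

Constraining the standard fields keeps search results consistent across the catalog; leaving the tag field open keeps the personality in.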

I don’t appreciate the fact that I searched for “sad” in Dequency and RUSL simply said “you’ll be fine”.

So what’s next?

Metadata is a complex topic, and the info here is high-level. As we build out the Dequency platform, we will dive deeper and explain more specifically how decisions around metadata and search affect the Dequency user experience.

While our beta includes basic search and tag functionality, we will continually enhance these features in future updates. Our goal is to build search and recommendation features that are frictionless and intelligent, so that users of Dequency can quickly and easily find great music.

If you have suggestions for metadata methodology, or simply feel strongly about a genre that should be included, come on into our Discord channel #ideas-and-feedback and sound off.

Thanks for reading (& see you next week!)

-Team Dequency
