Jakob Höflich visited #Tokyo last week as part of a German music industry delegation, meeting with labels, publishers, and music tech companies 🇯🇵 We summarized the learnings from the EMEE Japan Music Market Study 2024 and from conversations with Sony Music Japan representatives. (Some things might surprise you!)

Many thanks to the German Chamber of Commerce and Industry in Japan (AHK Japan) and Hamburg Music Business e.V. for this opportunity, and to Takayuki Suzuki for the EMEE study 🙌

Tell us in the comments which fact stood out for you!
Cyanite
Software Development
Berlin, Berlin · 4,862 followers
AI that transforms chaotic music catalogs into discoverable, valuable song libraries.
About
Cyanite is the AI music intelligence engine that helps publishers, platforms, and brands turn chaotic catalogs into discoverable, valuable music libraries. So every song gets a fair chance to be found.

At Cyanite, we help our partners:
🎯 Tag catalogs with genre, mood, energy, lyrics & more
🔍 Search using Free Text Prompts or Sound-Based Similarity
🛠️ Offer intuitive search UX via API integrations
💡 Deliver better briefs, faster pitches, and stronger metadata
🤝 Move from gut feeling to data-backed decisions

📊 Where we're at
• 40M+ songs tagged with Cyanite
• Trusted by 200+ companies globally
• 200,000+ users on our free Web App (artists, producers & more)
• Used by clients like Warner Chappell, RTL, APM Music, Epidemic Sound, and BMG

💡 What drives us
Our AI listens to sound, not popularity. No black boxes, no bias. Just a fairer, more transparent way to surface the right music for the right moment. Beyond commercial impact, we also support educational and non-profit innovation through Cyanite for Innovators because we believe music tech should also serve culture, not just capital.
- Website: https://cyanite.ai/
- Industry: Software Development
- Company size: 11–50 employees
- Headquarters: Berlin, Berlin
- Type: Corporation (AG, GmbH, UG, etc.)
- Founded: 2019
- Specialties: Machine Learning, Deep Learning, Artificial Intelligence, Music Information Retrieval, Data Science, Music Industry, API, SaaS, AI, and Music
Locations
- Primary: Berlin, Berlin 10961, DE
- Mannheim, Baden-Württemberg 68239, DE
Updates
-
Cyanite reposted this
AI isn't just changing how we make music; it's changing how we connect with it. I recently saw that first-hand while helping a label organize their catalog with Cyanite.ai.

AI is reshaping how we discover, classify, and monetize music, and Cyanite.ai is one of the most practical examples of that transformation. I recently helped a small record label run a pilot test using Cyanite's AI-powered API, and the results were amazing:

▶️ All the tracks were automatically tagged by mood, genre, and instrumentation in just a few hours.
▶️ The similarity search uncovered hidden gems that matched current sync opportunities.
▶️ Their CMS integrated the data seamlessly - no endless spreadsheets, no manual cleanup.

Cyanite's API turns AI from a buzzword into a workflow: it listens, understands, and structures your catalog with precision. In today's industry, metadata is more than information - it's intelligence. AI tools like Cyanite don't just help us organize music; they help us make better creative and business decisions.

⏭️ If you work with labels, sync libraries, or digital archives, I highly recommend exploring Cyanite's API. It's one of the most forward-thinking applications of AI in music metadata today.

#AItools #cyanite #musicmetadata #musicindustry https://cyanite.ai/
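For readers curious what such a tagging pilot might look like in practice, here is a minimal sketch of a catalog-tagging loop against an AI analysis API. The endpoint URL, auth header, request fields, and response shape below are hypothetical placeholders for illustration only, not Cyanite's actual API schema (Cyanite's real integration details are documented at cyanite.ai).

```python
# Hypothetical sketch of a catalog-tagging workflow against an AI tagging API.
# Endpoint, auth header, and response fields are illustrative placeholders,
# NOT Cyanite's actual API schema.
import requests

API_URL = "https://api.example-music-ai.com/v1/analyze"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"

def tag_track(track_id: str, audio_url: str) -> dict:
    """Send one track for analysis and return its tags (mood, genre, instruments)."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"trackId": track_id, "audioUrl": audio_url},
        timeout=60,
    )
    response.raise_for_status()
    result = response.json()
    # Assumed response shape: {"mood": [...], "genre": [...], "instruments": [...]}
    return {
        "track_id": track_id,
        "mood": result.get("mood", []),
        "genre": result.get("genre", []),
        "instruments": result.get("instruments", []),
    }

catalog = [
    {"id": "trk-001", "url": "https://example.com/audio/trk-001.mp3"},
    {"id": "trk-002", "url": "https://example.com/audio/trk-002.mp3"},
]
tagged = [tag_track(t["id"], t["url"]) for t in catalog]
```

The resulting records can then be pushed straight into a CMS, which is what replaces the "endless spreadsheets" mentioned in the post.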
-
Who's responsible for labeling AI-generated music? 🤔 Great question from Cherie Hu that the industry needs to answer.

At ISMIR Conference 2025, we saw breakthrough research on detecting AI-generated music - identifying subtle "fingerprints" left in the audio signal. The technology exists and it's getting better. But detection is only half the equation.

The harder questions:
→ Should platforms mandate AI disclosure labels?
→ Who enforces it - DSPs, labels, distributors, artists themselves?
→ What about hybrid workflows where humans and AI collaborate?
→ What happens when detection systems make mistakes?

That last point matters more than you might think. Research by Laura Cros Vila, Bob L. T. Sturm, Luca Casini, and David Dalmazzo presented at ISMIR shows that even advanced detection can mislabel human-made tracks as AI-generated - potentially causing platforms to remove real human art while trying to filter AI content.

Our take: Transparency matters. Listeners and music professionals deserve to know what they're working with. But the "how" - implementation, enforcement, and protecting artists from false positives - is still being figured out by the industry.

What's your perspective - who should be responsible for labeling AI music?

Credit: Cherie Hu: https://lnkd.in/dQZ5-Qm9
Research reference: https://lnkd.in/diU5esAj

#MusicAI #MusicIndustry #AIEthics #MusicTech
-
Exciting news: Our CEO Markus is speaking at ADE Pro for the first time! 🎤

We're honored to join Amsterdam Dance Event as speakers this year - tackling a topic that doesn't get enough attention: How AI bias quietly harms women artists' careers.

Amidst all the noise about AI ethics and licensing, gender bias in music recommendation systems continues to fly under the radar. Yet it's having real impact on who gets heard, who gets playlisted, and whose career takes off.

What Markus will cover:
👉 How bias enters machine learning systems (even with good intentions)
👉 The ripple effects on playlist placement, search relevance, and discovery
👉 Proprietary data showing the scale of the problem
👉 Concrete steps the industry can take to change this

This builds on our ISMIR 2025 research showing how popular music models associate genres disproportionately with singer gender. The data is undeniable: Current AI systems amplify inequality. But they don't have to.

📅 October 23, 14:15-14:45
📍 Felix Meritis, Amsterdam

At ADE and want to dive deeper? Markus and our Director of Technology Johannes will be around for meetings and tech deep dives. Drop us a message - we'd love to connect.

See you in Amsterdam! 🇳🇱

#ADE2025 #MusicAI #Bias #MusicIndustry #WomenInMusic
-
AI-generated music leaves fingerprints. Here are the key trends from the world's leading music AI event, the ISMIR Conference 🇰🇷 👇

Our Chief AI Officer, Roman, just returned from South Korea with fresh insights on where music AI research is heading. Here are the trends that caught our attention:

1. Detecting AI-generated music
👉 AI-generated music leaves subtle "fingerprints" in the audio signal. Deezer Research's team found ways to detect these patterns using established audio analysis techniques - helping distinguish human-made from AI-made music.

2. Connecting text and sound
👉 New research is improving how AI connects what you write ("upbeat indie rock") with what music actually sounds like. This matters for anyone using text-based music search - the better this connection, the more accurate your results. (A minimal sketch of the underlying idea follows after this post.)

3. Understanding music in detail
👉 Much research is directed at analyzing specific moments within a song - not just the overall vibe, but individual sections, transitions, and changes throughout a track. This enables more precise music matching.

4. Addressing cultural bias in music AI
👉 The Best Student Paper Award went to a paper on mitigating Western bias in Music Emotion Recognition. The team presented a dataset comprising participants from diverse continents and musical backgrounds, acknowledging that emotional responses to music vary across cultures.

This last point particularly resonates with our own work on bias detection. As music AI becomes more global, we need systems trained on truly diverse musical traditions.

These aren't just academic exercises - they're shaping how we think about building fairer, more accurate music technology at Cyanite. We're committed to continuously contributing to smarter and more inclusive music AI.

Links to all papers in the comments 👇

#ISMIR2025 #MusicAI #Research #MusicTech
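To make trend 2 concrete, here is a minimal sketch of how text-to-music retrieval typically works once a joint text-audio embedding model is available: the prompt and each track are mapped into the same vector space, and tracks are ranked by cosine similarity. The embeddings below are random stand-ins for illustration; in a real system they would come from a text-audio encoder. This is a generic illustration of the idea, not a description of any specific ISMIR paper.

```python
# Sketch: ranking a catalog against a text prompt by cosine similarity in a
# shared text-audio embedding space. Embeddings are random stand-ins here;
# a real system would obtain them from a joint text-audio model.
import numpy as np

rng = np.random.default_rng(42)
EMB_DIM = 512

prompt_embedding = rng.normal(size=EMB_DIM)      # e.g. embedding of "upbeat indie rock"
catalog = {
    "trk-001": rng.normal(size=EMB_DIM),
    "trk-002": rng.normal(size=EMB_DIM),
    "trk-003": rng.normal(size=EMB_DIM),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Rank tracks by how closely their audio embedding matches the text prompt.
ranked = sorted(
    catalog.items(),
    key=lambda item: cosine_similarity(prompt_embedding, item[1]),
    reverse=True,
)
for track_id, emb in ranked:
    print(track_id, round(cosine_similarity(prompt_embedding, emb), 3))
```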
-
“Dreamy, with soft piano, a subtle build-up, and a bittersweet undertone. Think rainy day reflection.”

Can AI turn that description into actual music recommendations? 🤔 We've been working on this since 2021 - long before prompt-based search became a trend.

The difference matters: Many prompt-based music searches use LLMs to generate tags, then search those tags. It's keyword search with extra steps. We train our models on the actual sound of music - understanding tempo shifts, analog synth timbres, and emotional trajectories. Not just converting "upbeat" into a genre filter.

But here's the real breakthrough: You can now combine prompts with structured filters in one search. 🔍 Search for "dreamy, bittersweet piano" AND filter by "instrumental" or "under 3 minutes." Get music that matches your creative vision and your practical requirements.

What's new: Our latest Free Text Search brings multilingual support, cultural references ("Harry Potter vibes"), and significantly better accuracy. Available now for all V7 API users.

The future of music discovery isn't prompts vs tags - it's using both intelligently. 🙌

Read the full breakdown: https://lnkd.in/e2rrQifp 👀

#MusicTech #AI #MusicDiscovery #MusicSearch #MusicIndustry
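As a rough illustration of the "prompts plus filters" idea, here is a minimal sketch that ranks candidates by a prompt-relevance score and then applies hard metadata constraints (instrumental only, under 3 minutes). The `search_by_prompt` function, the `Track` fields, and the scores are hypothetical stand-ins for illustration; this is not Cyanite's actual V7 API, whose real request format is described in the linked breakdown.

```python
# Sketch: combining a free-text prompt with structured filters in one search.
# Conceptual illustration only; search_by_prompt() stands in for whatever
# component returns prompt-relevance scores in a real system.
from dataclasses import dataclass

@dataclass
class Track:
    track_id: str
    duration_sec: int
    instrumental: bool
    prompt_score: float  # semantic relevance to the prompt, higher is better

def search_by_prompt(prompt: str) -> list[Track]:
    """Stand-in for a prompt-based search returning scored candidates."""
    return [
        Track("trk-001", 162, True, 0.91),
        Track("trk-002", 245, True, 0.88),
        Track("trk-003", 150, False, 0.85),
    ]

def search(prompt: str, max_duration_sec: int, instrumental_only: bool) -> list[Track]:
    """Semantic ranking by prompt, then hard filtering by practical requirements."""
    candidates = search_by_prompt(prompt)
    filtered = [
        t for t in candidates
        if t.duration_sec <= max_duration_sec
        and (t.instrumental or not instrumental_only)
    ]
    return sorted(filtered, key=lambda t: t.prompt_score, reverse=True)

results = search("dreamy, bittersweet piano", max_duration_sec=180, instrumental_only=True)
for t in results:
    print(t.track_id, t.prompt_score)
```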
-
A little sneak peek behind the scenes 🎬 Had a film crew in the office today, capturing what we do at Cyanite. Behind every AI company are real people solving real problems. Sometimes with better lighting than usual. More to come soon. Stay tuned. 🙌 #MusicTech #BehindTheScenes #AI #Berlin
-
Exciting news: Our CMO, Jakob, is heading to Japan! 🇯🇵

Jakob has been selected as part of an official German music industry delegation for "German Music and Event Industry meets Japan vol. 3" - a high-level business mission organized by the German Chamber of Commerce and Industry in Japan (AHK Japan) and Hamburg Music Business e.V. and supported by Germany's Federal Ministry of Economic Affairs and Energy.

Why this matters for Cyanite: Japan is one of the world's most sophisticated music markets, with unique approaches to discovery, curation, and technology integration. Our AI has been gaining traction in Asia, and this is a great chance to deepen our understanding of this dynamic region.

Our mission:
🌏 Develop new business relationships
🤝 Explore strategic partnerships
🎌 Build cultural understanding
🚀 Learn from Japan's innovative music tech landscape

We'd love your help! 👇
• Are you based in Japan? Jakob would love to connect.
• Do you know companies or people we should meet? Drop a comment or DM Jakob.
• Any tips on the Japanese music tech scene are more than welcome!

We would love your input to make the most of this mission. 🚀

#MusicTech #Japan #AI #MusicIndustry #Community
-
Live from ISMIR Conference 2025 in South Korea! 🙌 🇰🇷

Our research team, Roman, Arne, and Eylül Bektur, is presenting “Beyond Genre: Diagnosing Bias in Music Embeddings Using Concept Activation Vectors” at the world's leading music information retrieval conference.

Our findings: Popular music AI models like MERT, Whisper, and MuQ inherit significant cultural #biases - for example, associating certain genres disproportionately with singer gender.

The solution: A scalable method using Concept Activation Vectors to detect these biases, plus a debiasing strategy that works without retraining models. (A rough sketch of how a concept activation vector is typically computed follows after this post.)

Being here among the global #MIR #research community reinforces how critical this work is. As AI becomes central to music discovery and recommendation, we need systems that evaluate music fairly - not through inherited cultural assumptions.

This research directly informs how we build unbiased music intelligence at Cyanite.

#ISMIR2025 #MusicAI #Research #SouthKorea #Bias
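For readers unfamiliar with the technique, here is a minimal sketch of the classic way a Concept Activation Vector is obtained (following the general TCAV idea): train a linear classifier to separate embeddings of concept examples from random examples, and take the classifier's weight vector as the concept direction. This illustrates the general technique, not necessarily the exact procedure in the paper; the embeddings below are random stand-ins.

```python
# Sketch: computing a Concept Activation Vector (CAV) in an embedding space.
# A linear classifier separates "concept" embeddings (e.g. tracks with a given
# singer gender) from random embeddings; its weight vector is the concept
# direction. Embeddings are random stand-ins; a real analysis would use
# embeddings produced by the music model under study.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
EMB_DIM = 128

concept_embeddings = rng.normal(loc=0.5, size=(200, EMB_DIM))  # concept examples
random_embeddings = rng.normal(loc=0.0, size=(200, EMB_DIM))   # random counterexamples

X = np.vstack([concept_embeddings, random_embeddings])
y = np.array([1] * len(concept_embeddings) + [0] * len(random_embeddings))

clf = LogisticRegression(max_iter=1000).fit(X, y)
cav = clf.coef_[0] / np.linalg.norm(clf.coef_[0])  # unit-norm concept direction

# A genre embedding's projection onto the CAV indicates how strongly the model
# entangles that genre with the concept (e.g. singer gender).
genre_embedding = rng.normal(size=EMB_DIM)
print("alignment with concept direction:", float(genre_embedding @ cav))
```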
-
Our AI team is at ISMIR Conference 2025 in South Korea presenting groundbreaking work on bias in music AI models. 🇰🇷

Roman, Arne, and Eylül are showcasing our paper “Beyond Genre: Diagnosing Bias in Music Embeddings Using Concept Activation Vectors” - now with the official implementation available on GitHub.

The findings: Popular music models like MERT, Whisper, and MuQ inherit significant cultural #biases. For example, they associate certain genres disproportionately with singer gender - reinforcing stereotypes rather than analyzing music objectively.

To tackle the issue, the paper introduces a scalable method using Concept Activation Vectors to detect these biases, plus a debiasing strategy that works without retraining models. (One common way such post-hoc debiasing can work is sketched after this post.)

Why this matters: As #AI becomes central to music discovery and recommendation, we need systems that recommend music fairly - not through the lens of inherited cultural assumptions. This #research directly informs how to build unbiased musical intelligence.

At Cyanite, we see the responsibility to develop systems for fair representation across all music creators. The future of music AI should amplify creativity, not perpetuate prejudice.

Paper & implementation: https://lnkd.in/dN2ZcwUn

At ISMIR this week? Our team would love to connect with fellow researchers - whether working on fairness in music AI or general MIR :)

#ISMIR2025 #MusicAI #MusicTech #DataScience
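One common way to debias embeddings without retraining the underlying model is to project the identified bias direction (for example, a CAV) out of each embedding. The sketch below shows that generic projection step; it is an illustration of the idea only, not necessarily the exact strategy in the paper - the official implementation is in the linked repository.

```python
# Sketch: post-hoc debiasing by projecting a bias direction out of embeddings.
# Removes the linear component along a concept direction (e.g. a CAV) without
# retraining the underlying model. Generic illustration only.
import numpy as np

def remove_direction(embeddings: np.ndarray, direction: np.ndarray) -> np.ndarray:
    """Project each row of `embeddings` onto the hyperplane orthogonal to `direction`."""
    d = direction / np.linalg.norm(direction)
    return embeddings - np.outer(embeddings @ d, d)

rng = np.random.default_rng(1)
embeddings = rng.normal(size=(5, 128))   # stand-in track embeddings
bias_direction = rng.normal(size=128)    # e.g. a concept activation vector

debiased = remove_direction(embeddings, bias_direction)
# After projection, the embeddings carry (numerically) no signal along the bias direction.
print(np.max(np.abs(debiased @ (bias_direction / np.linalg.norm(bias_direction)))))
```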