Startling revelation from Kirk McElhearn: Apple Music isn’t using acoustic fingerprinting to figure out what your music is—it’s just believing whatever you tell it in metadata tags.
This is a very big problem with Apple Music. Since Apple already has the technology to match tracks using acoustic fingerprinting [in iTunes Match], they should be using this with Apple Music. Instead, it’s using scattershot matching, which results in lots of tracks showing up as being from different albums, from compilations, or totally different versions of songs.
I find it astonishing that a technology company as competent as Apple could launch a product this shoddy. That's not even bad design: it's bad technology. The only reason I can think of for doing it this way is speed, so that the initial sync feels fast. But what good is a fast first sync if the matches are inaccurate?
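To see why tag-based matching is so fragile, here's a minimal sketch of the two strategies. Everything in it is invented for illustration: a toy catalog, made-up tags, and a fake fingerprint hash standing in for a real acoustic fingerprint.

```python
# Toy "catalog" keyed two ways: by metadata tags and by audio fingerprint.
# All entries and hashes are hypothetical, purely for illustration.
CATALOG_BY_TAGS = {
    ("Hey Jude", "The Beatles", "Past Masters"): "Past Masters, Vol. 2",
}
CATALOG_BY_FINGERPRINT = {
    "a1b2c3": "Hey Jude (1968 single)",  # keyed by a hash of the audio itself
}

def match_by_tags(title, artist, album):
    # Metadata matching: trusts whatever the file's tags claim.
    # A retagged or mistagged copy matches the wrong album, or nothing.
    return CATALOG_BY_TAGS.get((title, artist, album), "no match")

def match_by_fingerprint(fingerprint):
    # Acoustic fingerprinting: identifies the recording itself,
    # no matter how the file happens to be tagged.
    return CATALOG_BY_FINGERPRINT.get(fingerprint, "no match")

# The same recording, tagged two different ways:
print(match_by_tags("Hey Jude", "The Beatles", "Past Masters"))  # matches
print(match_by_tags("Hey Jude", "The Beatles", "Greatest Hits"))  # no match
# Fingerprinting matches both copies, because the audio is identical:
print(match_by_fingerprint("a1b2c3"))
```

The point of the sketch: with tags, two copies of the identical recording can resolve to different albums, or fail to match at all, which is exactly the scattershot behavior McElhearn describes.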