Realtime audio on macOS in the age of asymmetrical multicore CPUs

It's now time when I bitch about, and document, my experiences dealing with Apple's documentation/APIs/etc. For some reason I never feel the need to do this on other platforms; maybe it's that they tend to have better documentation, or less fuss to deal with. I'm not sure why, but anyway, if you search for "macOS" on this blog you'll find previous installments. Anyway, let's begin.

A bit over a year ago Apple started making computers with their own CPUs, the M1. These have 8 or more cores, but a mix of slower and faster cores, the slower cores having lower power consumption (whether or not they are more efficient per unit of work done is unclear; it wouldn't surprise me if their efficiency was similar under full load, but anyway, now I'm just guessing). The implications of these asymmetric cores for realtime audio are pretty complex and can produce all sorts of weird behavior. The biggest issue seems to be when your code ends up running on the efficiency cores even though you need the results ASAP, causing underruns. Counterintuitively, under very light load things work well, and under very heavy load things work well, but at medium loads there is failure. Also counterintuitively, the newer M1 Pro and M1 Max CPUs, with more performance cores (6-8) and fewer efficiency cores (2), seem to have a larger "medium load" range where things don't work well.

Ignore the thread QoS APIs; for realtime audio they're apparently not applicable (and do not address these issues). Also, Xcode has a CPU meter, and for each thread it says "QoS unavailable". This was the biggest timesink for me - I spent a ton of time going "why doesn't this QoS setting do anything?"
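For context on what audio code typically does instead of QoS: Core Audio's own render threads run under Mach's time-constraint (real-time) scheduling policy, set via `thread_policy_set`, which is a different mechanism from the QoS classes. Below is a minimal sketch of promoting the current thread that way; the function name and the parameter values are illustrative (roughly one ~3 ms audio buffer), not tuned recommendations, and this is not a claim about how any particular app should pick them.

```c
// Sketch: promote the calling thread to the Mach time-constraint
// (real-time) scheduling class on macOS. This is the mechanism Core
// Audio's render threads use, as opposed to the QoS classes.
// promote_to_time_constraint() is a hypothetical helper name; the
// nanosecond values passed to it are illustrative only.
#include <stdint.h>

#ifdef __APPLE__
#include <mach/mach.h>
#include <mach/mach_time.h>
#include <pthread.h>

// Returns 0 on success, -1 on failure.
int promote_to_time_constraint(uint64_t period_ns,
                               uint64_t computation_ns,
                               uint64_t constraint_ns) {
    // Mach expects times in absolute-time units, not nanoseconds.
    mach_timebase_info_data_t tb;
    mach_timebase_info(&tb);

    thread_time_constraint_policy_data_t policy = {
        .period      = (uint32_t)(period_ns * tb.denom / tb.numer),
        .computation = (uint32_t)(computation_ns * tb.denom / tb.numer),
        .constraint  = (uint32_t)(constraint_ns * tb.denom / tb.numer),
        .preemptible = 1,
    };
    kern_return_t kr = thread_policy_set(
        pthread_mach_thread_np(pthread_self()),
        THREAD_TIME_CONSTRAINT_POLICY,
        (thread_policy_t)&policy,
        THREAD_TIME_CONSTRAINT_POLICY_COUNT);
    return kr == KERN_SUCCESS ? 0 : -1;
}
#else
// Stub so the sketch compiles off macOS.
int promote_to_time_constraint(uint64_t period_ns,
                               uint64_t computation_ns,
                               uint64_t constraint_ns) {
    (void)period_ns; (void)computation_ns; (void)constraint_ns;
    return -1;
}
#endif
```

Whether a thread promoted this way still gets parked on an efficiency core under medium load is exactly the kind of behavior the post is complaining about, so treat this as the starting point, not the fix.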