Compare commits

110 commits

Author SHA1 Message Date
Christopher Snowhill
10464b6761 First module converted to swift, but broken 2022-06-30 02:28:16 -07:00
Christopher Snowhill
73c4360b1d [Info Inspector] Improve formatting of sample rate
Sample rate now has locale-independent formatting, and no longer uses
scientific notation for large numbers.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-30 00:28:30 -07:00
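A minimal sketch of such formatting, assuming an NSNumberFormatter with a fixed POSIX locale; the helper name is illustrative, not Cog's actual code:

#import <Foundation/Foundation.h>

// Illustrative helper: format a sample rate without scientific notation and
// independently of the user's locale.
static NSString *FormatSampleRate(double sampleRate) {
    NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
    formatter.numberStyle = NSNumberFormatterDecimalStyle;
    formatter.locale = [NSLocale localeWithLocaleIdentifier:@"en_US_POSIX"];
    formatter.usesGroupingSeparator = NO; // e.g. "352800 Hz", not "352,800 Hz"
    formatter.maximumFractionDigits = 0;
    return [NSString stringWithFormat:@"%@ Hz", [formatter stringFromNumber:@(sampleRate)]];
}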
Christopher Snowhill
da21cd7341 [Play Counts] Fix reporting play counts
Play counts are guaranteed to be reported on the correct track now.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-29 23:28:11 -07:00
Christopher Snowhill
27478e5df2 Update libVGM and BASSMIDI, SF3 support
BASSMIDI now includes SF3 support, as well as several other changes.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-29 23:27:28 -07:00
Christopher Snowhill
d739e68e8e [Sandbox] Synchronize write accesses to storage
Synchronize writing to the bookmark storage to the main thread.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-29 19:41:03 -07:00
Christopher Snowhill
4ce180fb2a [Sandbox] Fix URL fragment removal function
This should be deleting from the #, including the #.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-29 19:39:47 -07:00
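A sketch of that behaviour as a plain string operation; the real helper may operate on NSURL instead:

#import <Foundation/Foundation.h>

// Illustrative: delete everything from the '#' onward, including the '#'.
static NSString *StringByRemovingFragment(NSString *path) {
    NSRange hash = [path rangeOfString:@"#"];
    if(hash.location == NSNotFound) return path;
    return [path substringToIndex:hash.location];
}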
Christopher Snowhill
3c0ccd9d46 [Context Menu] Hook up Reset Play Counts item
Hook up the Reset Play Counts menu item so it actually does something
instead of just sitting there looking pretty.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-29 12:42:12 -07:00
Christopher Snowhill
b24b9744c1 [Sandbox Config] Correctly test paths for files
Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-29 12:11:01 -07:00
Christopher Snowhill
61778b7165 [Sandbox] Remove startup folder consent prompt
Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-29 12:03:18 -07:00
Christopher Snowhill
7d26150c26 [Sandbox] Support bookmarking individual files
Individually added files, directly opened by the user, may now store
bookmarks in settings.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-29 12:00:25 -07:00
Christopher Snowhill
35400e1320 Remove the file tree, as Sandbox does not permit
The Sandbox does not permit such controls to exist.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-29 12:00:12 -07:00
3d94978f82 [Info Inspector] Made fields selectable, and fixed blending issue with
album art

Signed-off-by: Kevin López Brante <kevin@kddlb.cl>
2022-06-29 11:57:32 -07:00
fe7c424843 [About Window] Fixed appearance for systems without Dark Mode
Signed-off-by: Kevin López Brante <kevin@kddlb.cl>
2022-06-29 11:57:26 -07:00
Christopher Snowhill
1a4c140708 [Sandbox] Handle file tree path config better
Handle the configuration better, by adding the path to the grants list
if it is newly configured.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-29 00:31:54 -07:00
Christopher Snowhill
29c070a616 [Sandbox] Automatically save folder bookmarks
Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-28 23:15:08 -07:00
Christopher Snowhill
8b7418857d [Sandbox] Reduce entitlements granted by default
Since App Store approval decided these suddenly matter.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-28 23:14:53 -07:00
Christopher Snowhill
a35459719d [Sandbox] Show grant dialog on launch if empty
If there are no configured paths, show the grant page on every startup.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-28 23:14:21 -07:00
Christopher Snowhill
802a86a3d8 [Playlist Loading] Process messages while loading
Process main queue messages by handling the loading in a background
queue, syncing it to the main thread periodically and pausing to wait
for the results. This allows the file open dialog to return
immediately and display loading progress on the status bar.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-28 20:28:10 -07:00
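A rough sketch of that pattern using GCD; the two block parameters stand in for Cog's actual loading and progress-reporting steps:

#import <Foundation/Foundation.h>

// Sketch only: load on a background queue, pausing to sync with the main
// thread periodically so the open dialog returns immediately and progress
// can be drawn on the status bar.
static void AddURLsInBackground(NSArray<NSURL *> *urls,
                                void (^loadOneURL)(NSURL *url),
                                void (^reportProgress)(double fraction)) {
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
        NSUInteger done = 0;
        for(NSURL *url in urls) {
            loadOneURL(url);
            ++done;
            // Wait here while the main thread consumes the partial results.
            dispatch_sync(dispatch_get_main_queue(), ^{
                reportProgress((double)done / (double)urls.count);
            });
        }
    });
}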
Christopher Snowhill
f8d2837c4e [Playlist View] Change ratings column to variable
The ratings column needs to be made variable width, for variable font
sizes. If anyone knows how to force the width to fit the current text,
I'm open to suggestions.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-28 20:27:57 -07:00
Christopher Snowhill
112366c850 [Crash Handling] Enable exceptions for debugging
Debug builds should have exceptions enabled, rather than crashing.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-28 20:27:43 -07:00
Christopher Snowhill
a1a8607a84 [About Dialog] Add needed WebKit framework
This is needed for macOS older than 11.0? 10.15? to open the About
dialog without crashing.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-28 04:26:55 -07:00
Christopher Snowhill
690153f561 [Play Info] Implement track rating system
The track ratings are stored in the same stats table as the play counts.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-28 01:55:30 -07:00
Christopher Snowhill
b33e3ff6b3 [Audio Output] Restart correct track
When restarting playback on the current track, restart the correct
track, in case the restart happens near the end of it.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-27 22:03:02 -07:00
Christopher Snowhill
bedfac4e33 [Play Counts] Add option to (mass) reset counts
Add option to reset counts for all selected tracks on the playlist.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-27 21:50:14 -07:00
Christopher Snowhill
66102a6cda [Play Counts] Commit play count edits to storage
Was calling commitEditing rather than commitPersistentStore, whoops.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-27 21:49:32 -07:00
Christopher Snowhill
b36ebfe740 Resource templates touched by Xcode
Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-27 21:48:43 -07:00
Christopher Snowhill
2ed78a0639 [Play Counts] Track play counts of correct track
Track play counts for the correct track, even on short tracks. Also
correctly track the play count of the last played item in the play
queue, which stops with bufferChain set to nil, so the previous
implementation was not tracking it.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-27 21:46:36 -07:00
Christopher Snowhill
ae019409c5 [File Tree] Pop permission grant on setting root
Setting the root path should now pop up a permission grant dialog.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-27 16:17:19 -07:00
Christopher Snowhill
66262c2a71 [Sandbox Broker] Synchronize full access operation
Full access should be synchronized; otherwise, rapid access to the same
path from different threads will cause crashes.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-27 01:00:11 -07:00
Christopher Snowhill
f567750d56 [Metadata Cache] Actually run cleanup thread
Previously, the cleanup thread was not being run. Also, only reset the
metadata deduplication store when the cache is first emptied.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-27 00:58:56 -07:00
Christopher Snowhill
2a99bb076f [Audio Output] Change converter back to Obj-C
Change converter source file back from Objective-C++ to Objective-C.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-27 00:34:19 -07:00
Christopher Snowhill
dd65665990 [Sandbox Broker] Bypass entitled paths
Include entitlement-granted user folders in the permission check, so
that if the file or folder is nested under one of them, it allocates a
static permission object rather than querying the list of configured
paths every time. This also prevents the player from popping open the
path grant / suggester dialog every time a default path appears in the
listed file set, which should provide some relief to most users.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 23:52:38 -07:00
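A simplified sketch of that check, assuming the entitled locations are the standard user media folders; the real broker's list and its permission object are more involved:

#import <Foundation/Foundation.h>

// Illustrative: YES if the URL lies under a folder the app is already
// entitled to read, so no per-path bookmark lookup is needed.
static BOOL URLIsUnderEntitledFolder(NSURL *url) {
    // Hypothetical folder list; the real entitlements may differ.
    NSArray<NSNumber *> *folders = @[@(NSMusicDirectory), @(NSDownloadsDirectory),
                                     @(NSMoviesDirectory), @(NSPicturesDirectory)];
    NSString *path = [[url URLByStandardizingPath] path];
    NSFileManager *fm = [NSFileManager defaultManager];
    for(NSNumber *folder in folders) {
        NSURL *base = [[fm URLsForDirectory:[folder unsignedIntegerValue]
                                  inDomains:NSUserDomainMask] firstObject];
        // Simplified prefix check; a robust version would compare path components.
        if(base && [path hasPrefix:[[base URLByStandardizingPath] path]])
            return YES;
    }
    return NO;
}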
Christopher Snowhill
c477fbf553 Attempt to fix Xcode Cloud CI
Try to generate the Info.plist before xcodebuild runs.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 22:59:31 -07:00
Christopher Snowhill
03b3b43cfe Update versioning setup
Versioning now happens before building Cog itself, and goes
into the Info.plist in the project directory. The original
file became a template file which is altered any time a
build occurs.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 22:08:42 -07:00
Christopher Snowhill
9d8e278a57 Update debug.yml
Switch to building on macos-12
2022-06-26 22:07:12 -07:00
Christopher Snowhill
ed9e352543 Add Package.resolved back to repo
Guess we can't use GitHub Actions now, because this file breaks those.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 20:25:41 -07:00
Christopher Snowhill
16fdc1de6a Add CI scripts for Xcode Cloud
Add a post clone script for Xcode Cloud

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 20:22:54 -07:00
Christopher Snowhill
fc37e96099 Automatically unpack libraries before building
This required adding the included script in every project that links to
one of the bundled libraries. The script is designed to sleep for a
while if another thread is already extracting the libraries. The script
uses a temporary file as an extraction step lock, so other instances
sleep, and then detect the libraries.updated file, which is created
before the lock is removed.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 20:11:52 -07:00
Christopher Snowhill
03a2c0c16e Updated VGMStream to r1745-47-gfa55119d
Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 15:11:23 -07:00
Christopher Snowhill
206a3e42e7 [Equalizer] Prevent crash on stop
Wait for the equalizer to be shut down properly by the main thread
before destroying it. Otherwise, the main thread could crash on stop,
due to accessing the equalizer handle while it's being torn down in the
output thread.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 14:47:34 -07:00
Christopher Snowhill
6f6b5d6986 [Visualization System] Change API a bit
Now the API makes both PCM and FFT data optional, and will do nothing if
neither is requested. Also, it now supports a latency offset in seconds
with floating point precision. The two built-in visualizations currently
request zero latency. Increasing the latency asks for even older samples,
while specifying a negative offset requests samples from the "future"
relative to what the listener is hearing.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 05:39:24 -07:00
Christopher Snowhill
038b0b8067 [Play Events] Don't bug on end of playlist
Don't bug out at the end of the playlist, when didBeginStream receives a
nil track pointer. This should result in unsetting the current track in
the player, and not sending a DidBegin notification to everything,
including the visualization views' event handlers.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 05:28:43 -07:00
Christopher Snowhill
a57827f4da [Play Counts] Fix counts for tracks with subsongs
Stop counts for tracks with subsongs from piling onto the first subsong
seen, by using the URL fragment in the filename check and storage.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 04:37:41 -07:00
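A sketch of keying the stats by URL including its fragment, so each subsong gets its own entry; the key format here is illustrative:

#import <Foundation/Foundation.h>

// Illustrative: keep the subsong fragment (e.g. "game.nsf#3") in the key so
// plays of different subsongs no longer collapse onto the first one seen.
static NSString *StatsKeyForURL(NSURL *url) {
    NSString *key = [url path];
    NSString *fragment = [url fragment];
    if(fragment.length) key = [key stringByAppendingFormat:@"#%@", fragment];
    return key;
}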
Christopher Snowhill
39be3ab962 [Audio Output] Fix for previous commit
This fixes the problem caused by the following commit:

050aaaf852

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 03:57:19 -07:00
Christopher Snowhill
17df6cde4f Fix compilation
Oops, that last suggester change broke compilation.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 03:08:20 -07:00
Christopher Snowhill
2945de085d [Sandbox] Don't try to grant access to container
Do not try to grant access to the app's container folder when searching
for paths to add.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 02:58:25 -07:00
Christopher Snowhill
c2ef7d0e61 [Sandbox] Compare to the actual user paths
Remove the sandbox reference, because the user will add folders outside
the sandbox, and we have entitlements to access these folders.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 02:56:44 -07:00
Christopher Snowhill
96a7255779 [Sandbox] Suggest URLs that are contained in CUEs
Cuesheets can now expose which URLs they contain, which may help with
sandbox path configuration. That is, if the CUE sheets are already
readable.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 02:54:19 -07:00
Christopher Snowhill
050aaaf852 [Visualization] Resample more audio if present
If upsampling the audio by a significant factor, it may be necessary to
process more than one buffer at a time, rather than lose input.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 01:09:55 -07:00
Christopher Snowhill
5f52a4be81 [Audio Output] Better handle latency oddities
The visualization buffer now holds up to 45 seconds of audio in its
loop, and the latency measurement code caps the reported latency at 30
seconds, restarting the output if latency exceeds 30 seconds, such as
when a sound output is reset.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-26 01:09:01 -07:00
Christopher Snowhill
5f2335b796 [Audio Output] Play last track and stop correctly
Play last track up until it actually ends, and stop on command.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-25 06:42:56 -07:00
Christopher Snowhill
d33475953e [FFmpeg Decoder] Don't post redundant meta event
Don't post a metadata event on open, because inputs will relay it to the
player as an early notification bubble, which is unwanted.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-25 06:05:03 -07:00
Christopher Snowhill
7a56447271 [Audio Output] Fix serious memory leakage
For one thing, the example code I followed was Swift and handled
auto-releasing handles in the background, while Objective-C requires
manual handle reference management.

For another, there was no autoreleasepool around the block handling the
input audio chunks, which need to be released as they are pulled out and
disposed of. This also contributed to memory leakage.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-25 06:00:11 -07:00
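A sketch of the autoreleasepool half of the fix; the two blocks stand in for whatever produces and consumes the input audio chunks:

#import <Foundation/Foundation.h>

// Sketch only: wrap per-chunk work in an autoreleasepool so each chunk's
// autoreleased temporaries are freed as soon as it is consumed, instead of
// piling up for the lifetime of the output thread.
static void DrainChunks(id (^nextChunk)(void), void (^processChunk)(id chunk)) {
    BOOL done = NO;
    while(!done) {
        @autoreleasepool {
            id chunk = nextChunk();
            if(chunk)
                processChunk(chunk);
            else
                done = YES;
        } // temporaries from this iteration are released here
    }
}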
Christopher Snowhill
b86ec3340f [fdkaac] Update libfdk-aac to 2.0.2 with patches
Update fdk-aac library in dependencies package.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-25 05:13:22 -07:00
Christopher Snowhill
36c82a61e7 [Audio Output] Fix serious deadlock issue
There was a serious deadlock issue. Now it is fixed. Whew.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-25 05:12:43 -07:00
Christopher Snowhill
ab13b66755 [InputNode] Syntax code fix
This code was misformatted.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-25 05:10:33 -07:00
Christopher Snowhill
86de03a1ab [Play Count Info] Tabulate first seen info later
Tabulate first seen information when loading the metadata, rather than
when first adding the tracks to the playlist. This should ensure first
seen information is recorded once metadata is available, as the
information is useless without track titles.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-25 02:40:05 -07:00
Christopher Snowhill
0f923e6072 [Cuesheet] Greatly improve loading performance
Cuesheets were invoking a seek operation on open, rather than on first
playback, and this takes a heavy toll on FFmpeg audio formats, apparently.
Defer the initial seek to the first readAudio call, and do not invoke it
if a seek was already called on that input session.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-25 02:38:17 -07:00
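A sketch of that deferral; the class and its members are hypothetical, not Cog's actual cuesheet decoder:

#import <Foundation/Foundation.h>

// Sketch only: defer the expensive initial seek until the first read, and
// skip it entirely if the caller has already seeked this session.
@interface DeferredSeekSource : NSObject {
    BOOL pendingInitialSeek;
    long trackStartFrame;
}
- (instancetype)initWithTrackStart:(long)startFrame;
- (long)seek:(long)frame;
- (long)readAudio:(void *)buffer frames:(long)frames;
@end

@implementation DeferredSeekSource
- (instancetype)initWithTrackStart:(long)startFrame {
    if((self = [super init])) {
        trackStartFrame = startFrame;
        pendingInitialSeek = YES; // no seek at open time
    }
    return self;
}

- (long)seek:(long)frame {
    pendingInitialSeek = NO; // an explicit seek supersedes the deferred one
    // ... seek the underlying decoder to trackStartFrame + frame ...
    return frame;
}

- (long)readAudio:(void *)buffer frames:(long)frames {
    if(pendingInitialSeek) [self seek:0]; // first read pays the seek cost
    // ... read from the underlying decoder into buffer ...
    return 0;
}
@end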
Christopher Snowhill
cb2ce5675a [FFmpeg] Fix chapter handling and seeking
Fix chapter startup, and chapter seeking.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-25 02:36:14 -07:00
Christopher Snowhill
1f56e5ef5a [FFmpeg] Seek including skip samples
This is essential for chapters, as otherwise, we would be skipping an
awful lot of samples every chapter, or every seek within a chapter.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-25 01:43:36 -07:00
Christopher Snowhill
2663b5007d [FFmpeg] Support files with chapters
Support file chapters, including metadata reading for each chapter.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-25 01:35:07 -07:00
Christopher Snowhill
72572c9c7f [FFmpeg] Deduce the length from the container
Determine the length of the file from the container, rather than the
individual audio stream. The former is more likely to be set than the
latter is.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-25 00:05:57 -07:00
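A sketch of that fallback using FFmpeg's public fields; error handling is omitted:

#include <libavformat/avformat.h>

// Sketch: prefer the container duration (in AV_TIME_BASE units) and only
// fall back to the audio stream's own duration if the container lacks one.
static double TotalSeconds(AVFormatContext *fmt, int audioStreamIndex) {
    if(fmt->duration != AV_NOPTS_VALUE)
        return (double)fmt->duration / AV_TIME_BASE;
    AVStream *st = fmt->streams[audioStreamIndex];
    if(st->duration != AV_NOPTS_VALUE)
        return st->duration * av_q2d(st->time_base);
    return -1.0; // unknown
}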
Christopher Snowhill
3de7a34eb8 [FFmpeg] Update FFmpeg library and decoder plugin
Update based on newest changes from upstream.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 23:51:12 -07:00
Christopher Snowhill
86d8f04966 [Audio Output] Correctly configure WAVE layouts
Correctly configure AVFoundation with the channel layouts supported by
WAVEFORMATEXTENSIBLE speaker position flags, which includes varied
formats supported by FFmpeg and Core Audio inputs.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 23:29:32 -07:00
Christopher Snowhill
b55955ef1c [Audio Output] Correctly delay layout updates
Channel layout updates should be delayed when resampling, just like
sample format changes are.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 23:29:18 -07:00
Christopher Snowhill
8e1175bbd4 [FFmpeg] Update minimum platform for x86_64
Update minimum platform version to macOS 10.13.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 22:33:50 -07:00
Christopher Snowhill
62e2880b49 [FFmpeg] Enable TrueHD decoder and demuxer
Oops, somehow I didn't enable TrueHD support.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 22:22:29 -07:00
Christopher Snowhill
1ac3e5cd22 [FFmpeg] Update libfdk-aac fixed point patch
Update this patch to the latest FFmpeg master source, and update to use
fmtconvert instead of a naive for loop.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 22:18:25 -07:00
Christopher Snowhill
d2eb4af3d5 [Audio Output] Stop immediately, and fix deadlocks
Stop output when requested, except on natural completion of the last
track in the play queue. Also fix deadlocks with stopping and
restarting.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 19:15:20 -07:00
Christopher Snowhill
50b7390181 [Vorbis / Opus] Do not assume text encoding
Stream metadata encoding may not be UTF-8, even though the Vorbis
Comment specification clearly calls for this.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 19:15:15 -07:00
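A sketch of not assuming UTF-8: try it first, then fall back to a legacy-encoding guess; Cog's actual detection may be smarter:

#import <Foundation/Foundation.h>

// Illustrative: decode a Vorbis comment value, preferring UTF-8 as the spec
// requires but tolerating tags written in a legacy 8-bit encoding.
static NSString *StringFromCommentData(NSData *data) {
    NSString *text = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
    if(text) return text; // valid UTF-8
    // Fallback guess for non-conforming taggers.
    return [[NSString alloc] initWithData:data encoding:NSISOLatin1StringEncoding];
}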
Christopher Snowhill
abf80c19ac Move static and dynamic libraries to archive
Please remember to unpack the archive before building, and again
whenever it is updated by a future version.
2022-06-24 17:04:57 -07:00
Christopher Snowhill
b21a02fe1b [Sandbox] Change preference dialog descriptions
Make the descriptions better reflect what the settings actually do.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 17:04:52 -07:00
Christopher Snowhill
dd35639174 [OpenMPT / OpenMPT Legacy] Fix include paths
Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 17:04:48 -07:00
Christopher Snowhill
43433c244e [mpg123] Fix include paths 2022-06-24 17:04:44 -07:00
Christopher Snowhill
870a5afed7 [Sandbox Broker] Fix deadlock and crash
The crash was because we weren't copying the results array before
iterating over it, and the deadlock was because this was forced to go
through the main thread rather than its calling thread, which could lock
up if the main thread was busy working with the Sandbox Broker object.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 17:04:39 -07:00
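A sketch of the crash half of that fix, snapshotting the array before enumerating it so a mutation elsewhere can't trip fast enumeration:

#import <Foundation/Foundation.h>

// Illustrative: enumerate an immutable snapshot of a possibly shared array.
static void EnumerateSnapshot(NSArray *results, void (^body)(id item)) {
    NSArray *snapshot = [results copy]; // the original may keep changing
    for(id item in snapshot) {
        body(item);
    }
}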
Christopher Snowhill
b9ef5853d6 [Metadata] Commit first seen date for whole batch
Commit only once the entire batch is loaded and processed. Also commit
using the correct function.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 17:04:35 -07:00
Christopher Snowhill
1a9c73d166 [OpenMPT / vgmstream] Made libraries pre-built
Made the OpenMPT, legacy OpenMPT, and mpg123 libraries pre-built.
Changed the OpenMPT and vgmstream plugins to import the libraries in
their new pre-built form. Made mpg123 embedded and imported by the main
binary, since it's now shared by two plugins.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 17:04:30 -07:00
Christopher Snowhill
ec393d186a [Sandbox Broker] Copy results array
Hopefully this heads off a crash elsewhere.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 17:04:16 -07:00
Christopher Snowhill
dfb773e9cb [Audio Output] Properly handle end of playlist
Handle audio at the end of the playlist, flushing playback until all
output stops.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 03:46:01 -07:00
Christopher Snowhill
438b142558 [Audio Output] Synchronize access, report latency
Report resampler latency properly, and synchronize access to the
resampler objects.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 03:45:16 -07:00
Christopher Snowhill
26a63e85b7 [Visualization] Resample all visualizer audio
Visualizer audio is now resampled to 44100 Hz, for consistency across
the system.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 03:43:50 -07:00
Christopher Snowhill
ccbeaf16dc [Audio Output] Resample unsupported sample rates
These rates are too high for Apple's output routines, for some reason.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 02:46:23 -07:00
Christopher Snowhill
cc5de69e9f [Core Data Store] Fix startup playlist pruning
The playlist was being pruned of entries marked for deletion, but they
were not being pruned from the set that was then added to the player.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 00:34:30 -07:00
Christopher Snowhill
80adb85b36 [Path Suggester] Automatically pop where required
The Path Suggester will now automatically open when new files are added
to the playlist and a given path is not in the sandbox settings. It will
also pop up when either the File Tree or MIDI SoundFont path
configuration setting is changed.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-24 00:29:50 -07:00
Christopher Snowhill
f3f3d436ba [Sandbox Broker] Synchronize storage access
Synchronize storage access to the main thread only, to prevent
enumeration from racing with the main thread writing to the storage.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-23 23:35:26 -07:00
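A sketch of funnelling those writes through the main thread; the helper name is illustrative:

#import <Foundation/Foundation.h>

// Illustrative: run a storage mutation on the main thread whether or not the
// caller is already there, so enumeration and writes cannot interleave.
static void OnMainThreadSync(dispatch_block_t block) {
    if([NSThread isMainThread])
        block();
    else
        dispatch_sync(dispatch_get_main_queue(), block);
}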
be6453e048 [Sandbox] Fixed path suggester window
Added a title to the window, made the table view resize properly, and
removed the font size inheritance from the main window.

Signed-off-by: Kevin López Brante <kevin@kddlb.cl>
2022-06-23 23:26:16 -07:00
Christopher Snowhill
8af32e8d2e Replace Core Audio output with Core Media runtime
The output now uses AVSampleBufferAudioRenderer to play all formats, and
uses that to resample. It also supports Spatial Audio on macOS 12.0 or
newer. Note that there are some outstanding bugs with Spatial Audio
support, namely that it appears to be limited to 192 kHz at mono or
stereo, or 352800 Hz at surround configurations. This breaks DSD64
playback at stereo formats, as well as possibly other things. This is
entirely an Apple bug. I have reported it to Apple under reference code
FB10441301, in case anyone else wants to complain that it isn't fixed.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-23 23:22:41 -07:00
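A heavily simplified sketch of that output path; building a CMSampleBuffer from PCM is hidden behind a hypothetical block, and the real output also handles format changes, latency, and stopping:

#import <AVFoundation/AVFoundation.h>

// Sketch only: wire an AVSampleBufferAudioRenderer to a render synchronizer
// and feed it sample buffers on demand.
static void StartRenderer(CMSampleBufferRef (^nextSampleBuffer)(void)) {
    AVSampleBufferAudioRenderer *renderer = [[AVSampleBufferAudioRenderer alloc] init];
    AVSampleBufferRenderSynchronizer *synchronizer = [[AVSampleBufferRenderSynchronizer alloc] init];
    [synchronizer addRenderer:renderer];

    dispatch_queue_t queue = dispatch_queue_create("render", DISPATCH_QUEUE_SERIAL);
    [renderer requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
        while(renderer.readyForMoreMediaData) {
            CMSampleBufferRef buffer = nextSampleBuffer(); // hypothetical PCM -> CMSampleBuffer step
            if(!buffer) break; // no data ready yet
            [renderer enqueueSampleBuffer:buffer];
            CFRelease(buffer); // assumes the block returns a +1 reference
        }
    }];

    // Start playback at the current time, normal rate.
    [synchronizer setRate:1.0 time:kCMTimeZero];
}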
Christopher Snowhill
5b6dacd29c Cog now requires macOS 10.13 as a minimum version
All optional fallback code for older versions has also been removed, and
everything now assumes 10.13.0 or newer. Some cases are still included
for point releases, such as 10.13.2.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-22 22:54:32 -07:00
Christopher Snowhill
ff44bc4d34 Updated VGMStream to r1745-37-g776c4d8c
Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-22 19:33:32 -07:00
Christopher Snowhill
62824a94bd Serialize persistent store update to main thread
This needs to be called on the main thread, as something may or may not
be enumerating over the data while this thread decides to call it.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-22 19:11:32 -07:00
Christopher Snowhill
903bc9cba5 [Playlist Info Loader] Do not clear if loading
Do not clear the progress indicator if a loading task is already running
in the background, but instead return without doing anything.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-22 19:05:27 -07:00
Christopher Snowhill
f274a8ef73 Sync version number with main branch 2022-06-22 16:28:08 -07:00
0317f2a649 [About Window] Fix
Pull request #281 by @nevack.
2022-06-22 16:11:59 -07:00
b484a0be44 [About Window] Reorganized credits and added @nevack and myself in them
Signed-off-by: Kevin López Brante <kevin@kddlb.cl>
2022-06-22 16:11:38 -07:00
Dzmitry Neviadomski
36a9411b14 Fix runtime warnings in Window/AboutWindowController.xib
Fix typo in File Owner class name and remove absent outlet.
2022-06-22 16:11:24 -07:00
64fefce18d Merge pull request #280 from losnoco/nevack/about-window
AboutWindow adjustments
2022-06-22 16:07:16 -07:00
Dzmitry Neviadomski
3a6e41cabd AboutWindow adjustments
Allow opening links in default browser
Close window on Esc
Add rounded corners
2022-06-22 16:06:26 -07:00
Christopher Snowhill
632ba36f13 Changed updater script to handle new version strings
New version strings are in a different place, and Sparkle will no longer
be including the Git hash in the CFBundleVersion query, so we must get
it from the ZIP filename.
2022-06-22 01:18:43 -07:00
Christopher Snowhill
271b9b34d0 One last attempt to fix CI
This should fix building. I don't know how I missed those.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-21 23:51:05 -07:00
Christopher Snowhill
8d031f394b Assign blank development team in project files
Hopefully this blank assignment will spare these files from being
touched by Xcode again in the future, when the variable in question is
imported from a developer supplied configuration file.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-21 23:27:53 -07:00
Christopher Snowhill
f2c6ae39c3 Remove developer supplied configuration file
This file should not be referenced directly by projects, otherwise it
will be expected to exist, even in CI.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-21 23:26:51 -07:00
Christopher Snowhill
bb95270747 [Volume Control] Only initialize view once
Only initialize viewController once, the first time the volume control
is opened. Re-initializing it can cause an error assigning it as first
responder to the volume slider popover view.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-21 22:52:05 -07:00
Christopher Snowhill
aa36e3ce10 Completely overhaul code signing practices
Redesign the code signing from the ground up. Now all bundles and their
embedded frameworks import the Shared.xcconfig file and enable its
settings, so they may be signed with Apple Development instead of Sign
to Run Locally. This apparently isn't necessary for frameworks which are
embedded in the main app bundle directly, only for the bundles and their
frameworks.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-21 22:42:33 -07:00
Christopher Snowhill
da8a4dffdf Remove deep forced code signing option
This option should no longer be needed for anything.
2022-06-21 19:40:15 -07:00
a4692b80a4 [Main Menu] Added Privacy Policy link in App Menu
Signed-off-by: Kevin López Brante <kevin@kddlb.cl>
2022-06-21 19:31:06 -07:00
de5cce8351 [About Dialog] Switched to WebView for credits
Signed-off-by: Kevin López Brante <kevin@kddlb.cl>
2022-06-21 19:30:58 -07:00
Christopher Snowhill
05da7450da [Crashlytics] Require asking user consent
Require asking user consent for data transmission on first launch, or
otherwise disable sending crash reports by default.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-21 19:16:23 -07:00
Christopher Snowhill
2b8156e86c [Info Plist] Auto format XML escapes
Automatically format any XML escapes of file type association names.
Adjust Info.plist to account for this change.

Signed-off-by: Christopher Snowhill <kode54@gmail.com>
2022-06-21 19:14:00 -07:00
Christopher Snowhill
d59b5335e9 Revert "Removed Sparkle"
This reverts commit b54ee58ec3.
2022-06-21 18:00:30 -07:00
Christopher Snowhill
bc9e7b5d67 Revert "Remove stray entitlement from Sparkle"
This reverts commit 5ea6c9dde7.
2022-06-21 18:00:09 -07:00
3094 changed files with 382718 additions and 183629 deletions

@ -3,7 +3,7 @@ name: Feedback
about: Report bugs or suggest new features
title: ''
labels:
assignees: kode54
assignees: kode54, nevack
---

@ -1,44 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 24.2.1, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
viewBox="0 0 2809.9 600" style="enable-background:new 0 0 2809.9 600;" xml:space="preserve">
<style type="text/css">
.st0{fill:#FF6336;}
.st1{fill:#FFC501;}
.st2{fill:#A4A14A;}
</style>
<g>
<polygon class="st0" points="393.5,468.8 524.7,468.8 524.7,376.1 355.1,376.1 262.3,468.8 169.6,376.1 0,376.1 0,468.8
131.2,468.8 262.3,600 "/>
<rect y="190.6" class="st1" width="524.7" height="92.8"/>
<rect y="5.1" class="st2" width="524.7" height="92.8"/>
</g>
<path d="M733.3,5.1h83.4v455.2h-83.4V5.1z"/>
<path d="M946.7,447.3c-26.3-14.5-47.2-34.5-62.6-59.7c-15.4-25.3-23.1-53.1-23.1-83.4s7.7-58.2,23.1-83.4
c15.4-25.3,36.2-45.2,62.6-59.7c26.3-14.5,54.9-21.8,85.7-21.8c30.8,0,59.3,7.3,85.7,21.8c26.3,14.5,47.2,34.5,62.6,59.7
c15.4,25.3,23.1,53.1,23.1,83.4s-7.7,58.2-23.1,83.4c-15.4,25.3-36.3,45.2-62.6,59.7c-26.3,14.5-54.9,21.8-85.7,21.8
C1001.5,469.1,973,461.8,946.7,447.3z M1076.2,380.9c13.3-7.4,23.9-17.8,31.9-31.3s12-28.7,12-45.5s-4-32-12-45.5
s-18.6-23.9-31.9-31.3c-13.3-7.4-27.9-11.1-43.9-11.1s-30.7,3.7-43.9,11.1c-13.3,7.4-23.9,17.8-31.9,31.3s-12,28.7-12,45.5
c0,16.9,4,32,12,45.5s18.6,23.9,31.9,31.3c13.3,7.4,27.9,11.1,43.9,11.1C1048.3,391.9,1063,388.3,1076.2,380.9z"/>
<path d="M1247.9,5.1h83.4v271.2L1440,147.9h99.2l-122.6,144.8l131.5,167.5h-106.8l-110-144.8v144.8h-83.4L1247.9,5.1L1247.9,5.1z"/>
<path d="M1626.9,448.8c-23.4-13.5-42.4-32.8-56.9-57.8c-14.5-25.1-21.8-54-21.8-86.9c0-29.9,7-57.5,20.9-82.8s32.9-45.3,56.9-60.1
c24-14.7,50.3-22.1,79-22.1c20.2,0,38.7,3.4,55.3,10.1c16.6,6.7,29.4,15.8,38.3,27.2V148h83.4v312.3h-83.4v-28.4
c-13.1,12.2-27,21.5-41.7,27.8c-14.8,6.3-33.7,9.5-56.9,9.5C1674.6,469.1,1650.2,462.3,1626.9,448.8z M1778.9,366.7
c15.6-16.9,23.4-37.7,23.4-62.6s-7.8-45.7-23.4-62.6c-15.6-16.8-36.2-25.3-61.9-25.3c-25.7,0-46.4,8.4-62,25.3s-23.4,37.7-23.4,62.6
s7.8,45.7,23.4,62.6s36.2,25.3,62,25.3C1742.6,391.9,1763.3,383.5,1778.9,366.7z"/>
<path d="M1942.6,5.1h83.4v455.2h-83.4V5.1z"/>
<path d="M2091.2,89.8C2081,79.7,2076,67.4,2076,53.1c0-14.7,5.1-27.3,15.2-37.6C2101.3,5.2,2113.5,0,2127.8,0
c14.7,0,27.3,5.2,37.6,15.5s15.5,22.9,15.5,37.6c0,14.3-5.2,26.5-15.5,36.7c-10.3,10.1-22.9,15.2-37.6,15.2
C2113.5,104.9,2101.3,99.9,2091.2,89.8z M2086.7,147.9h83.4v312.3h-83.4V147.9z"/>
<path d="M2227.1,438.7l19-78.4h3.8c27.4,21.1,55.4,31.6,84.1,31.6c11.8,0,21.4-2.2,28.8-6.6c7.4-4.4,11.1-10.8,11.1-19.3
c0-8.8-4.3-16-13-21.5c-8.6-5.5-24.8-12.2-48.4-20.2c-24-8-42.7-19.6-55.9-34.8c-13.3-15.2-19.9-33.1-19.9-53.7
c0-29.1,10.8-52.5,32.6-70.2c21.7-17.7,49.2-26.6,82.5-26.6c16.9,0,31.8,1.6,44.9,4.7c13.1,3.2,25.5,8.3,37.3,15.5l3.2,79.6h-4.4
c-15.2-9.7-28.7-17-40.5-21.8s-25.1-7.3-39.8-7.3c-10.5,0-19.2,2.1-25.9,6.3s-10.1,9.7-10.1,16.4c0,8.9,4.2,16.1,12.7,21.8
c8.4,5.7,24.2,12.5,47.4,20.5c26.5,8.9,46.7,19.6,60.4,32.2s20.5,32.2,20.5,58.8c0,21.9-5.5,40.7-16.4,56.3
c-11,15.6-25.4,27.3-43.3,35.1c-17.9,7.8-37.6,11.7-59.1,11.7C2294.9,469.1,2257.8,459,2227.1,438.7z"/>
<path d="M2574.8,446.9c-26.1-14.7-46.7-34.9-61.6-60.4c-15-25.5-22.4-53.8-22.4-85c0-30.3,7.1-57.8,21.2-82.5
c14.1-24.7,33.7-44.1,58.8-58.5c25.1-14.3,53.4-21.5,85-21.5c32,0,59.7,7.5,83.1,22.4c23.4,15,41.1,34.9,53.1,59.7
c12,24.9,18,51.6,18,80.3v24.7h-239.6c5.1,23.6,15.9,41.5,32.6,53.7c16.6,12.2,38.7,18.3,66.1,18.3c41.7,0,78.2-13.7,109.4-41.1h8.9
l-3.2,79c-19,11-39,19.2-60.1,24.7s-41.3,8.2-60.7,8.2C2630.4,469.1,2600.9,461.7,2574.8,446.9z M2726.5,269.3
c-2.1-19-10-33.8-23.7-44.6c-13.7-10.7-30.5-16.1-50.3-16.1c-19.4,0-36.4,5.2-50.9,15.5s-24.3,25.4-29.4,45.2H2726.5z"/>
</svg>

@ -11,16 +11,12 @@ on:
jobs:
build:
name: Build Universal Cog.app
runs-on: macos-15
runs-on: macos-12
env:
XCODE_DERIVEDDATA_PATH: build
steps:
- name: Switch to Xcode 16
uses: maxim-lobanov/setup-xcode@v1
with:
xcode-version: 16
- name: Check out repository
uses: actions/checkout@v4
uses: actions/checkout@v2
with:
submodules: recursive
- name: Unpack libraries
@ -50,7 +46,7 @@ jobs:
$XCODE_DERIVEDDATA_PATH/Build/Products/Debug/Cog.app
$XCODE_DERIVEDDATA_PATH/Cog.zip
- name: Upload Artifact
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v2
with:
name: Cog
path: ${{ env.XCODE_DERIVEDDATA_PATH }}/Cog.zip

.gitignore (19 lines changed)

@ -1,13 +1,12 @@
.DS_Store
xcuserdata
/build
./build
# Special cog exceptions
!Frameworks/OpenMPT/OpenMPT/build
# User-specific xcconfig files
Xcode-config/DEVELOPMENT_TEAM.xcconfig
Xcode-config/SENTRY_SETTINGS.xcconfig
# Plist derived from template at build time
/Info.plist
@ -20,7 +19,6 @@ Xcode-config/SENTRY_SETTINGS.xcconfig
# The project will unpack these before building, if necessary
/ThirdParty/BASS/libbass.dylib
/ThirdParty/BASS/libbass_mpc.dylib
/ThirdParty/BASS/libbassflac.dylib
/ThirdParty/BASS/libbassmidi.dylib
/ThirdParty/BASS/libbassopus.dylib
@ -32,11 +30,11 @@ Xcode-config/SENTRY_SETTINGS.xcconfig
/ThirdParty/fdk-aac/lib/libfdk-aac.dylib
/ThirdParty/fdk-aac/lib/libfdk-aac.la
/ThirdParty/fdk-aac/lib/pkgconfig/fdk-aac.pc
/ThirdParty/ffmpeg/lib/libavcodec.61.dylib
/ThirdParty/ffmpeg/lib/libavformat.61.dylib
/ThirdParty/ffmpeg/lib/libavutil.59.dylib
/ThirdParty/ffmpeg/lib/libswresample.5.dylib
/ThirdParty/flac/lib/libFLAC.12.dylib
/ThirdParty/ffmpeg/lib/libavcodec.59.dylib
/ThirdParty/ffmpeg/lib/libavformat.59.dylib
/ThirdParty/ffmpeg/lib/libavutil.57.dylib
/ThirdParty/ffmpeg/lib/libswresample.4.dylib
/ThirdParty/flac/lib/libFLAC.8.dylib
/ThirdParty/libid3tag/lib/libid3tag.a
/ThirdParty/libmad/lib/libmad.a
/ThirdParty/libopenmpt/lib/libopenmpt.a
@ -48,9 +46,4 @@ Xcode-config/SENTRY_SETTINGS.xcconfig
/ThirdParty/ogg/lib/libogg.0.dylib
/ThirdParty/opus/lib/libopus.0.dylib
/ThirdParty/opusfile/lib/libopusfile.0.dylib
/ThirdParty/rubberband/lib/librubberband.3.dylib
/ThirdParty/speex/libspeex.a
/ThirdParty/vorbis/lib/libvorbisfile.3.dylib
/ThirdParty/vorbis/lib/libvorbis.0.dylib
/ThirdParty/soxr/lib/libsoxr.0.dylib
/ThirdParty/WavPack/lib/libwavpack.a

.gitmodules (7 lines changed)

@ -3,7 +3,7 @@
url = https://github.com/kode54/mgba.git
[submodule "Frameworks/AdPlug/AdPlug/adplug"]
path = Frameworks/AdPlug/AdPlug/adplug
url = https://github.com/kode54/adplug.git
url = https://github.com/adplug/adplug.git
[submodule "Frameworks/libbinio/libbinio/libbinio"]
path = Frameworks/libbinio/libbinio/libbinio
url = https://github.com/adplug/libbinio.git
@ -15,7 +15,10 @@
url = https://github.com/Thealexbarney/LibAtrac9.git
[submodule "Frameworks/shpakovski/MASShortcut"]
path = Frameworks/shpakovski/MASShortcut
url = https://github.com/kode54/MASShortcut.git
url = https://github.com/shpakovski/MASShortcut.git
[submodule "Frameworks/libsidplayfp/sidplayfp"]
path = Frameworks/libsidplayfp/sidplayfp
url = https://github.com/kode54/libsidplayfp.git
[submodule "Audio/ThirdParty/r8brain-free-src"]
path = Audio/ThirdParty/r8brain-free-src
url = https://github.com/kode54/r8brain-free-src

AboutCog.jp2 (binary file, contents not shown)

@ -2,11 +2,11 @@
#import <Cocoa/Cocoa.h>
@class FileTreeViewController;
@class PlaybackController;
@class PlaylistController;
@class PlaylistView;
@class PlaylistLoader;
@class SUUpdater;
@class PreferencesController;
@interface AppController : NSObject {
@ -19,7 +19,6 @@
IBOutlet NSWindow *mainWindow;
IBOutlet NSWindow *miniWindow;
IBOutlet NSSplitView *mainView;
IBOutlet NSSegmentedControl *playbackButtons;
IBOutlet NSButton *fileButton;
@ -37,7 +36,6 @@
IBOutlet NSMenuItem *showArtistColumn;
IBOutlet NSMenuItem *showAlbumColumn;
IBOutlet NSMenuItem *showGenreColumn;
IBOutlet NSMenuItem *showPlayCountColumn;
IBOutlet NSMenuItem *showLengthColumn;
IBOutlet NSMenuItem *showTrackColumn;
IBOutlet NSMenuItem *showYearColumn;
@ -47,7 +45,7 @@
IBOutlet NSWindowController *spotlightWindowController;
IBOutlet FileTreeViewController *fileTreeViewController;
IBOutlet SUUpdater *updater;
IBOutlet PreferencesController *preferencesController;
@ -67,6 +65,11 @@
- (IBAction)delEntries:(id)sender;
- (IBAction)savePlaylist:(id)sender;
- (IBAction)openLiberapayPage:(id)sender;
- (IBAction)openPaypalPage:(id)sender;
- (IBAction)openKofiPage:(id)sender;
- (IBAction)openPatreonPage:(id)sender;
- (IBAction)privacyPolicy:(id)sender;
- (IBAction)feedback:(id)sender;
@ -103,11 +106,6 @@
- (void)showPathSuggester;
+ (void)globalShowPathSuggester;
- (void)selectTrack:(id)sender;
- (IBAction)showRubberbandSettings:(id)sender;
+ (void)globalShowRubberbandSettings;
@property NSWindow *mainWindow;
@property NSWindow *miniWindow;

@ -1,17 +1,12 @@
#import "AppController.h"
#import "Cog-Swift.h"
#import "FileTreeController.h"
#import "FileTreeOutlineView.h"
#import "FileTreeViewController.h"
#import "FontSizetoLineHeightTransformer.h"
#import "OpenURLPanel.h"
#import "PathNode.h"
#import "PlaybackController.h"
#import "PlaylistController.h"
#import "PlaylistEntry.h"
#import "PlaylistLoader.h"
#import "PlaylistView.h"
#import "RubberbandEngineTransformer.h"
#import "SQLiteStore.h"
#import "SandboxBroker.h"
#import "SpotlightWindowController.h"
@ -28,13 +23,12 @@
#import "Shortcuts.h"
#import <MASShortcut/Shortcut.h>
#import <MASShortcut/MASDictionaryTransformer.h>
#import <Sparkle/Sparkle.h>
#import "PreferencesController.h"
#import "FeedbackController.h"
@import Sentry;
@import Firebase;
void *kAppControllerContext = &kAppControllerContext;
@ -75,14 +69,6 @@ static AppController *kAppController = nil;
NSValueTransformer *numberHertzToStringTransformer = [[NumberHertzToStringTransformer alloc] init];
[NSValueTransformer setValueTransformer:numberHertzToStringTransformer
forName:@"NumberHertzToStringTransformer"];
NSValueTransformer *rubberbandEngineEnabledTransformer = [[RubberbandEngineEnabledTransformer alloc] init];
[NSValueTransformer setValueTransformer:rubberbandEngineEnabledTransformer
forName:@"RubberbandEngineEnabledTransformer"];
NSValueTransformer *rubberbandEngineHiddenTransformer = [[RubberbandEngineHiddenTransformer alloc] init];
[NSValueTransformer setValueTransformer:rubberbandEngineHiddenTransformer
forName:@"RubberbandEngineHiddenTransformer"];
}
- (id)init {
self = [super init];
@ -110,10 +96,8 @@ static AppController *kAppController = nil;
[p beginSheetModalForWindow:mainWindow
completionHandler:^(NSInteger result) {
if(result == NSModalResponseOK) {
NSDictionary *loadEntryData = @{@"entries": [p URLs],
@"sort": @(YES),
@"origin": @(URLOriginExternal)};
[self->playlistController performSelectorInBackground:@selector(addURLsInBackground:) withObject:loadEntryData];
[self->playlistLoader willInsertURLs:[p URLs] origin:URLOriginExternal];
[self->playlistLoader didInsertURLs:[self->playlistLoader addURLs:[p URLs] sort:YES] origin:URLOriginExternal];
} else {
[p close];
}
@ -150,10 +134,8 @@ static AppController *kAppController = nil;
- (void)openURLPanelDidEnd:(OpenURLPanel *)panel returnCode:(int)returnCode contextInfo:(void *)contextInfo {
if(returnCode == NSModalResponseOK) {
NSDictionary *loadEntriesData = @{ @"entries": @[[panel url]],
@"sort": @(NO),
@"origin": @(URLOriginExternal) };
[playlistController performSelectorInBackground:@selector(addURLsInBackground:) withObject:loadEntriesData];
[playlistLoader willInsertURLs:@[[panel url]] origin:URLOriginExternal];
[playlistLoader didInsertURLs:[playlistLoader addURLs:@[[panel url]] sort:NO] origin:URLOriginExternal];
}
}
@ -169,13 +151,21 @@ static AppController *kAppController = nil;
return [key isEqualToString:@"currentEntry"];
}
static BOOL consentLastEnabled = NO;
- (void)awakeFromNib {
[[NSUserDefaults standardUserDefaults] registerDefaults:@{ @"sentryConsented": @(NO),
@"sentryAskedConsent": @(NO) }];
#if DEBUG
[[NSUserDefaults standardUserDefaults] registerDefaults:@{ @"NSApplicationCrashOnExceptions": @(NO) }];
#else
[[NSUserDefaults standardUserDefaults] registerDefaults:@{ @"NSApplicationCrashOnExceptions": @(YES) }];
#endif
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.sentryConsented" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kAppControllerContext];
[FIRApp configure];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.crashlyticsConsented" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kAppControllerContext];
#ifdef DEBUG
// Prevent updates automatically in debug builds
[updater setAutomaticallyChecksForUpdates:NO];
#endif
[[totalTimeField cell] setBackgroundStyle:NSBackgroundStyleRaised];
@ -186,10 +176,6 @@ static BOOL consentLastEnabled = NO;
[randomizeButton setToolTip:NSLocalizedString(@"RandomizeButtonTooltip", @"")];
[fileButton setToolTip:NSLocalizedString(@"FileButtonTooltip", @"")];
[self registerDefaultHotKeys];
[self migrateHotKeys];
[self registerHotKeys];
(void)[spotlightWindowController init];
@ -219,7 +205,6 @@ static BOOL consentLastEnabled = NO;
if(!sandboxBroker) {
ALog(@"Sandbox broker init failed.");
}
[SandboxBroker cleanupFolderAccess];
[[playlistController undoManager] enableUndoRegistration];
@ -238,7 +223,7 @@ static BOOL consentLastEnabled = NO;
NSError *error = nil;
NSArray *results = [playlistController.persistentContainer.viewContext executeFetchRequest:request error:&error];
if(results && [results count] > 0) {
if(results && [results count] == 1) {
PlaylistEntry *pe = results[0];
if([[NSUserDefaults standardUserDefaults] boolForKey:@"resumePlaybackOnStartup"]) {
[playbackController playEntryAtIndex:pe.index startPaused:(lastStatus == CogStatusPaused) andSeekTo:@(pe.currentPosition)];
@ -249,13 +234,6 @@ static BOOL consentLastEnabled = NO;
pe.countAdded = NO;
[playlistController commitPersistentStore];
}
// Bug fix
if([results count] > 1) {
for(size_t i = 1; i < [results count]; ++i) {
PlaylistEntry *pe = results[i];
[pe setCurrent:NO];
}
}
}
}
@ -267,50 +245,11 @@ static BOOL consentLastEnabled = NO;
[self setFloatingMiniWindow:[[NSUserDefaults standardUserDefaults]
boolForKey:@"floatingMiniWindow"]];
// We need file tree view to restore its state here
// so attempt to access file tree view controller's root view
// to force it to read nib and create file tree view for us
//
// TODO: there probably is a more elegant way to do all this
// but i'm too stupid/tired to figure it out now
[fileTreeViewController view];
FileTreeOutlineView *outlineView = [fileTreeViewController outlineView];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(nodeExpanded:) name:NSOutlineViewItemDidExpandNotification object:outlineView];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(nodeCollapsed:) name:NSOutlineViewItemDidCollapseNotification object:outlineView];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(updateDockMenu:) name:CogPlaybackDidBeginNotificiation object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(updateDockMenu:) name:CogPlaybackDidStopNotificiation object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(updateDockMenu:) name:CogPlaybackDidBeginNotficiation object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(updateDockMenu:) name:CogPlaybackDidStopNotficiation object:nil];
[self updateDockMenu:nil];
NSArray *expandedNodesArray = [[NSUserDefaults standardUserDefaults] valueForKey:@"fileTreeViewExpandedNodes"];
if(expandedNodesArray) {
expandedNodes = [[NSMutableSet alloc] initWithArray:expandedNodesArray];
} else {
expandedNodes = [[NSMutableSet alloc] init];
}
DLog(@"Nodes to expand: %@", [expandedNodes description]);
DLog(@"Num of rows: %ld", [outlineView numberOfRows]);
if(!outlineView) {
DLog(@"outlineView is NULL!");
}
[outlineView reloadData];
for(NSInteger i = 0; i < [outlineView numberOfRows]; i++) {
PathNode *pn = [outlineView itemAtRow:i];
NSString *str = [[pn URL] absoluteString];
if([expandedNodes containsObject:str]) {
[outlineView expandItem:pn];
}
}
[self addObserver:self
forKeyPath:@"playlistController.currentEntry"
options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
@ -325,45 +264,10 @@ static BOOL consentLastEnabled = NO;
return;
}
if([keyPath isEqualToString:@"values.sentryConsented"]) {
BOOL enabled = [[NSUserDefaults standardUserDefaults] boolForKey:@"sentryConsented"];
if(enabled != consentLastEnabled) {
if(enabled) {
[SentrySDK startWithConfigureOptions:^(SentryOptions *options) {
options.dsn = @"https://b5eda1c2390eb965a74dd735413b6392@cog-analytics.losno.co/3";
options.debug = YES; // Enabled debug when first installing is always helpful
// Temporary until there's a better solution
options.enableAppHangTracking = NO;
// Set tracesSampleRate to 1.0 to capture 100% of transactions for performance monitoring.
// We recommend adjusting this value in production.
options.tracesSampleRate = @1.0;
options.profilesSampleRate = @1.0;
// Adds IP for users.
// For more information, visit: https://docs.sentry.io/platforms/apple/data-management/data-collected/
options.sendDefaultPii = YES;
// And now to set up user feedback prompting
options.onCrashedLastRun = ^void(SentryEvent * _Nonnull event) {
// capture user feedback
FeedbackController *fbcon = [[FeedbackController alloc] init];
[fbcon performSelectorOnMainThread:@selector(showWindow:) withObject:nil waitUntilDone:YES];
if([fbcon waitForCompletion]) {
SentryFeedback *feedback = [[SentryFeedback alloc] initWithMessage:[fbcon comments] name:[fbcon name] email:[fbcon email] source:SentryFeedbackSourceCustom associatedEventId:event.eventId attachments:nil];
[SentrySDK captureFeedback:feedback];
}
};
}];
} else {
if([SentrySDK isEnabled]) {
[SentrySDK close];
}
}
consentLastEnabled = enabled;
}
if([keyPath isEqualToString:@"values.crashlyticsConsented"]) {
BOOL enabled = [[NSUserDefaults standardUserDefaults] boolForKey:@"crashlyticsConsented"];
[[FIRCrashlytics crashlytics] setCrashlyticsCollectionEnabled:enabled];
[FIRAnalytics setAnalyticsCollectionEnabled:enabled];
} else if([keyPath isEqualToString:@"playlistController.currentEntry"]) {
PlaylistEntry *entry = playlistController.currentEntry;
NSString *appTitle = NSLocalizedString(@"CogTitle", @"");
@ -433,20 +337,6 @@ static BOOL consentLastEnabled = NO;
}
}
- (void)nodeExpanded:(NSNotification *)notification {
PathNode *node = [[notification userInfo] objectForKey:@"NSObject"];
NSString *url = [[node URL] absoluteString];
[expandedNodes addObject:url];
}
- (void)nodeCollapsed:(NSNotification *)notification {
PathNode *node = [[notification userInfo] objectForKey:@"NSObject"];
NSString *url = [[node URL] absoluteString];
[expandedNodes removeObject:url];
}
- (NSApplicationTerminateReply)applicationShouldTerminate:(NSApplication *)sender {
if(playbackController.progressOverall) {
[playbackController.progressOverall addObserver:self forKeyPath:@"finished" options:0 context:kAppControllerContext];
@ -520,7 +410,6 @@ static BOOL consentLastEnabled = NO;
DLog(@"Saving expanded nodes: %@", [expandedNodes description]);
[[NSUserDefaults standardUserDefaults] setValue:[expandedNodes allObjects] forKey:@"fileTreeViewExpandedNodes"];
// Workaround window not restoring it's size and position.
[miniWindow setContentSize:NSMakeSize(miniWindow.frame.size.width, 1)];
[miniWindow saveFrameUsingName:@"Mini Window"];
@ -539,10 +428,8 @@ static BOOL consentLastEnabled = NO;
- (BOOL)application:(NSApplication *)theApplication openFile:(NSString *)filename {
NSArray *urls = @[[NSURL fileURLWithPath:filename]];
NSDictionary *loadEntriesData = @{ @"entries": urls,
@"sort": @(NO),
@"origin": @(URLOriginExternal) };
[playlistController performSelectorInBackground:@selector(addURLsInBackground:) withObject:loadEntriesData];
[playlistLoader willInsertURLs:urls origin:URLOriginExternal];
[playlistLoader didInsertURLs:[playlistLoader addURLs:urls sort:NO] origin:URLOriginExternal];
return YES;
}
@ -551,50 +438,31 @@ static BOOL consentLastEnabled = NO;
NSMutableArray *urls = [NSMutableArray array];
for(NSString *filename in filenames) {
NSURL *url = nil;
if([[NSFileManager defaultManager] fileExistsAtPath:filename]) {
url = [NSURL fileURLWithPath:filename];
} else {
if([filename hasPrefix:@"/http/::"] ||
[filename hasPrefix:@"/https/::"]) {
// Stupid Carbon bodge for AppleScript
NSString *method = nil;
NSString *server = nil;
NSString *path = nil;
NSScanner *objScanner = [NSScanner scannerWithString:filename];
if(![objScanner scanString:@"/" intoString:nil] ||
![objScanner scanUpToString:@"/" intoString:&method] ||
![objScanner scanString:@"/::" intoString:nil] ||
![objScanner scanUpToString:@":" intoString:&server] ||
![objScanner scanString:@":" intoString:nil]) {
continue;
[urls addObject:[NSURL fileURLWithPath:filename]];
}
[objScanner scanUpToCharactersFromSet:[NSCharacterSet illegalCharacterSet] intoString:&path];
// Colons in server were converted to shashes, convert back
NSString *convertedServer = [server stringByReplacingOccurrencesOfString:@"/" withString:@":"];
// Slashes in path were converted to colons, convert back
NSString *convertedPath = [path stringByReplacingOccurrencesOfString:@":" withString:@"/"];
url = [NSURL URLWithString:[NSString stringWithFormat:@"%@://%@/%@", method, convertedServer, convertedPath]];
}
}
if(url) {
[urls addObject:url];
}
}
NSDictionary *loadEntriesData = @{ @"entries": urls,
@"sort": @(YES),
@"origin": @(URLOriginExternal) };
[playlistController performSelectorInBackground:@selector(addURLsInBackground:) withObject:loadEntriesData];
[playlistLoader willInsertURLs:urls origin:URLOriginExternal];
[playlistLoader didInsertURLs:[playlistLoader addURLs:urls sort:YES] origin:URLOriginExternal];
[theApplication replyToOpenOrPrint:NSApplicationDelegateReplySuccess];
}
- (IBAction)openLiberapayPage:(id)sender {
[[NSWorkspace sharedWorkspace] openURL:[NSURL URLWithString:@"https://liberapay.com/kode54"]];
}
- (IBAction)openPaypalPage:(id)sender {
[[NSWorkspace sharedWorkspace] openURL:[NSURL URLWithString:@"https://www.paypal.com/paypalme/kode54"]];
}
- (IBAction)openKofiPage:(id)sender {
[[NSWorkspace sharedWorkspace] openURL:[NSURL URLWithString:@"https://ko-fi.com/kode54"]];
}
- (IBAction)openPatreonPage:(id)sender {
[[NSWorkspace sharedWorkspace] openURL:[NSURL URLWithString:@"https://www.patreon.com/kode54"]];
}
- (IBAction)privacyPolicy:(id)sender {
[[NSWorkspace sharedWorkspace] openURL:[NSURL URLWithString:NSLocalizedString(@"PrivacyPolicyURL", @"Privacy policy URL from Iubenda.")]];
[[NSWorkspace sharedWorkspace] openURL:[NSURL URLWithString:@"https://www.iubenda.com/privacy-policy/59859310"]];
}
- (IBAction)feedback:(id)sender {
@ -620,14 +488,14 @@ static BOOL consentLastEnabled = NO;
NSMutableDictionary *userDefaultsValuesDict = [NSMutableDictionary dictionary];
// Font defaults
float fFontSize = [NSFont systemFontSizeForControlSize:NSControlSizeRegular];
float fFontSize = [NSFont systemFontSizeForControlSize:NSControlSizeSmall];
NSNumber *fontSize = @(fFontSize);
[userDefaultsValuesDict setObject:fontSize forKey:@"fontSize"];
NSString *feedURLdefault = @"https://cogcdn.cog.losno.co/mercury.xml";
[userDefaultsValuesDict setObject:feedURLdefault forKey:@"SUFeedURL"];
[userDefaultsValuesDict setObject:@"enqueueAndPlay" forKey:@"openingFilesBehavior"];
[userDefaultsValuesDict setObject:@"clearAndPlay" forKey:@"openingFilesBehavior"];
[userDefaultsValuesDict setObject:@"enqueue" forKey:@"openingFilesAlteredBehavior"];
[userDefaultsValuesDict setObject:@"albumGainWithPeak" forKey:@"volumeScaling"];
@ -636,7 +504,7 @@ static BOOL consentLastEnabled = NO;
[userDefaultsValuesDict setObject:@(CogStatusStopped) forKey:@"lastPlaybackStatus"];
[userDefaultsValuesDict setObject:@"BASSMIDI" forKey:@"midiPlugin"];
[userDefaultsValuesDict setObject:@"dls appl" forKey:@"midiPlugin"];
[userDefaultsValuesDict setObject:@"default" forKey:@"midi.flavor"];
@ -652,18 +520,9 @@ static BOOL consentLastEnabled = NO;
NSData *barColor = [colorToValueTransformer reverseTransformedValue:[NSColor colorWithSRGBRed:1.0 green:0.5 blue:0 alpha:1.0]];
NSData *dotColor = [colorToValueTransformer reverseTransformedValue:[NSColor systemRedColor]];
[userDefaultsValuesDict setObject:@(YES) forKey:@"spectrumSceneKit"];
[userDefaultsValuesDict setObject:barColor forKey:@"spectrumBarColor"];
[userDefaultsValuesDict setObject:dotColor forKey:@"spectrumDotColor"];
[userDefaultsValuesDict setObject:@(150.0) forKey:@"synthDefaultSeconds"];
[userDefaultsValuesDict setObject:@(8.0) forKey:@"synthDefaultFadeSeconds"];
[userDefaultsValuesDict setObject:@(2) forKey:@"synthDefaultLoopCount"];
[userDefaultsValuesDict setObject:@(44100) forKey:@"synthSampleRate"];
[userDefaultsValuesDict setObject:@NO forKey:@"alwaysStopAfterCurrent"];
[userDefaultsValuesDict setObject:@YES forKey:@"selectionFollowsPlayback"];
// Register and sync defaults
[[NSUserDefaults standardUserDefaults] registerDefaults:userDefaultsValuesDict];
[[NSUserDefaults standardUserDefaults] synchronize];
@ -692,100 +551,9 @@ static BOOL consentLastEnabled = NO;
if([[[NSUserDefaults standardUserDefaults] stringForKey:@"midiPlugin"] isEqualToString:@"FluidSynth"]) {
[[NSUserDefaults standardUserDefaults] setValue:@"BASSMIDI" forKey:@"midiPlugin"];
}
NSString *midiPlugin = [[NSUserDefaults standardUserDefaults] stringForKey:@"midiPlugin"];
if([midiPlugin length] == 8 && [[midiPlugin substringFromIndex:4] isEqualToString:@"appl"]) {
[[NSUserDefaults standardUserDefaults] setObject:@"BASSMIDI" forKey:@"midiPlugin"];
}
}
MASShortcut *shortcutWithMigration(NSString *oldKeyCodePrefName,
NSString *oldKeyModifierPrefName,
NSString *newShortcutPrefName,
NSInteger newDefaultKeyCode) {
NSEventModifierFlags defaultModifiers = NSEventModifierFlagControl | NSEventModifierFlagCommand;
NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
if([defaults objectForKey:oldKeyCodePrefName]) {
NSInteger oldKeyCode = [defaults integerForKey:oldKeyCodePrefName];
NSEventModifierFlags oldKeyModifiers = [defaults integerForKey:oldKeyModifierPrefName];
// Should we consider temporarily save these values for further migration?
[defaults removeObjectForKey:oldKeyCodePrefName];
[defaults removeObjectForKey:oldKeyModifierPrefName];
return [MASShortcut shortcutWithKeyCode:oldKeyCode modifierFlags:oldKeyModifiers];
} else {
return [MASShortcut shortcutWithKeyCode:newDefaultKeyCode modifierFlags:defaultModifiers];
}
}
static NSDictionary *shortcutDefaults = nil;
- (void)registerDefaultHotKeys {
MASShortcut *playShortcut = shortcutWithMigration(@"hotKeyPlayKeyCode",
@"hotKeyPlayModifiers",
CogPlayShortcutKey,
kVK_ANSI_P);
MASShortcut *nextShortcut = shortcutWithMigration(@"hotKeyNextKeyCode",
@"hotKeyNextModifiers",
CogNextShortcutKey,
kVK_ANSI_N);
MASShortcut *prevShortcut = shortcutWithMigration(@"hotKeyPreviousKeyCode",
@"hotKeyPreviousModifiers",
CogPrevShortcutKey,
kVK_ANSI_R);
MASShortcut *spamShortcut = [MASShortcut shortcutWithKeyCode:kVK_ANSI_C
modifierFlags:NSEventModifierFlagControl | NSEventModifierFlagCommand];
MASShortcut *fadeShortcut = [MASShortcut shortcutWithKeyCode:kVK_ANSI_O
modifierFlags:NSEventModifierFlagControl | NSEventModifierFlagCommand];
MASShortcut *seekBkwdShortcut = [MASShortcut shortcutWithKeyCode:kVK_LeftArrow
modifierFlags:NSEventModifierFlagControl | NSEventModifierFlagCommand];
MASShortcut *seekFwdShortcut = [MASShortcut shortcutWithKeyCode:kVK_RightArrow
modifierFlags:NSEventModifierFlagControl | NSEventModifierFlagCommand];
MASDictionaryTransformer *transformer = [MASDictionaryTransformer new];
NSDictionary *playShortcutDict = [transformer reverseTransformedValue:playShortcut];
NSDictionary *nextShortcutDict = [transformer reverseTransformedValue:nextShortcut];
NSDictionary *prevShortcutDict = [transformer reverseTransformedValue:prevShortcut];
NSDictionary *spamShortcutDict = [transformer reverseTransformedValue:spamShortcut];
NSDictionary *fadeShortcutDict = [transformer reverseTransformedValue:fadeShortcut];
NSDictionary *seekBkwdShortcutDict = [transformer reverseTransformedValue:seekBkwdShortcut];
NSDictionary *seekFwdShortcutDict = [transformer reverseTransformedValue:seekFwdShortcut];
// Register default values to be used for the first app start
NSDictionary<NSString *, NSDictionary *> *defaultShortcuts = @{
CogPlayShortcutKey: playShortcutDict,
CogNextShortcutKey: nextShortcutDict,
CogPrevShortcutKey: prevShortcutDict,
CogSpamShortcutKey: spamShortcutDict,
CogFadeShortcutKey: fadeShortcutDict,
CogSeekBackwardShortcutKey: seekBkwdShortcutDict,
CogSeekForwardShortcutKey: seekFwdShortcutDict
};
shortcutDefaults = defaultShortcuts;
[[NSUserDefaults standardUserDefaults] registerDefaults:defaultShortcuts];
}
- (IBAction)resetHotkeys:(id)sender {
[shortcutDefaults enumerateKeysAndObjectsUsingBlock:^(id _Nonnull key, id _Nonnull obj, BOOL * _Nonnull stop) {
[[NSUserDefaults standardUserDefaults] setObject:obj forKey:key];
}];
}
- (void)migrateHotKeys {
NSArray *inKeys = @[CogPlayShortcutKeyV1, CogNextShortcutKeyV1, CogPrevShortcutKeyV1, CogSpamShortcutKeyV1, CogFadeShortcutKeyV1, CogSeekBackwardShortcutKeyV1, CogSeekForwardShortcutKeyV1];
NSArray *outKeys = @[CogPlayShortcutKey, CogNextShortcutKey, CogPrevShortcutKey, CogSpamShortcutKey, CogFadeShortcutKey, CogSeekBackwardShortcutKey, CogSeekForwardShortcutKey];
for(size_t i = 0, j = [inKeys count]; i < j; ++i) {
NSString *inKey = inKeys[i];
NSString *outKey = outKeys[i];
id value = [[NSUserDefaults standardUserDefaults] objectForKey:inKey];
if(value && value != [NSNull null]) {
[[NSUserDefaults standardUserDefaults] setObject:value forKey:outKey];
[[NSUserDefaults standardUserDefaults] removeObjectForKey:inKey];
}
}
}
/* Unassign previous handler first, so dealloc can unregister it from the global map before the new instances are assigned */
- (void)registerHotKeys {
MASShortcutBinder *binder = [MASShortcutBinder sharedBinder];
[binder bindShortcutWithDefaultsKey:CogPlayShortcutKey
@ -807,21 +575,6 @@ static NSDictionary *shortcutDefaults = nil;
toAction:^{
[self clickSpam];
}];
[binder bindShortcutWithDefaultsKey:CogFadeShortcutKey
toAction:^{
[self clickFade];
}];
[binder bindShortcutWithDefaultsKey:CogSeekBackwardShortcutKey
toAction:^{
[self clickSeekBack];
}];
[binder bindShortcutWithDefaultsKey:CogSeekForwardShortcutKey
toAction:^{
[self clickSeekForward];
}];
}
- (void)clickPlay {
@ -848,22 +601,10 @@ static NSDictionary *shortcutDefaults = nil;
[playbackController spam:nil];
}
- (void)clickFade {
[playbackController fade:nil];
}
- (void)clickSeek:(NSTimeInterval)position {
[playbackController seek:self toTime:position];
}
- (void)clickSeekBack {
[playbackController seekBackward:10.0];
}
- (void)clickSeekForward {
[playbackController seekForward:10.0];
}
- (void)changeFontSize:(float)size {
NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
float fCurrentSize = [defaults floatForKey:@"fontSize"];
@ -933,7 +674,7 @@ static NSDictionary *shortcutDefaults = nil;
BOOL hideItem = NO;
if([[notification name] isEqualToString:CogPlaybackDidStopNotificiation] || !pe || ![pe artist] || [[pe artist] isEqualToString:@""])
if([[notification name] isEqualToString:CogPlaybackDidStopNotficiation] || !pe || ![pe artist] || [[pe artist] isEqualToString:@""])
hideItem = YES;
if(hideItem && [dockMenu indexOfItem:currentArtistItem] == 0) {
@ -959,21 +700,4 @@ static NSDictionary *shortcutDefaults = nil;
[kAppController showPathSuggester];
}
- (void)showRubberbandSettings:(id)sender {
[preferencesController showRubberbandSettings:sender];
}
+ (void)globalShowRubberbandSettings {
[kAppController showRubberbandSettings:kAppController];
}
- (void)selectTrack:(id)sender {
PlaylistEntry *pe = (PlaylistEntry *)sender;
@try {
[playlistView selectRowIndexes:[NSIndexSet indexSetWithIndex:pe.index] byExtendingSelection:NO];
}
@catch(NSException *e) {
}
}
@end

View file

@ -13,13 +13,6 @@
@interface DockIconController : NSObject {
NSImage *dockImage;
NSInteger lastDockCustom;
NSInteger lastDockCustomPlaque;
NSInteger dockCustomLoaded;
NSImage *dockCustomStop;
NSImage *dockCustomPlay;
NSImage *dockCustomPause;
IBOutlet PlaybackController *playbackController;
NSInteger lastPlaybackStatus;

View file

@ -14,24 +14,16 @@
static NSString *DockIconPlaybackStatusObservationContext = @"DockIconPlaybackStatusObservationContext";
static NSString *CogCustomDockIconsReloadNotification = @"CogCustomDockIconsReloadNotification";
- (void)startObserving {
[playbackController addObserver:self forKeyPath:@"playbackStatus" options:(NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial) context:(__bridge void *_Nullable)(DockIconPlaybackStatusObservationContext)];
[playbackController addObserver:self forKeyPath:@"progressOverall" options:(NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionOld) context:(__bridge void *_Nullable)(DockIconPlaybackStatusObservationContext)];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.colorfulDockIcons" options:0 context:(__bridge void *_Nullable)(DockIconPlaybackStatusObservationContext)];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.customDockIcons" options:0 context:(__bridge void *_Nullable)(DockIconPlaybackStatusObservationContext)];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.customDockIconsPlaque" options:0 context:(__bridge void *_Nullable)(DockIconPlaybackStatusObservationContext)];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(refreshDockIcons:) name:CogCustomDockIconsReloadNotification object:nil];
}
- (void)stopObserving {
[playbackController removeObserver:self forKeyPath:@"playbackStatus" context:(__bridge void *_Nullable)(DockIconPlaybackStatusObservationContext)];
[playbackController removeObserver:self forKeyPath:@"progressOverall" context:(__bridge void *_Nullable)(DockIconPlaybackStatusObservationContext)];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.colorfulDockIcons" context:(__bridge void *_Nullable)(DockIconPlaybackStatusObservationContext)];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.customDockIcons" context:(__bridge void *_Nullable)(DockIconPlaybackStatusObservationContext)];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.customDockIconsPlaque" context:(__bridge void *_Nullable)(DockIconPlaybackStatusObservationContext)];
[[NSNotificationCenter defaultCenter] removeObserver:self name:CogCustomDockIconsReloadNotification object:nil];
}
- (void)startObservingProgress:(NSProgress *)progress {
@ -50,34 +42,6 @@ static NSString *getBadgeName(NSString *baseName, BOOL colorfulIcons) {
}
}
static NSString *getCustomIconName(NSString *baseName) {
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSApplicationSupportDirectory, NSUserDomainMask, YES);
NSString *basePath = [[paths firstObject] stringByAppendingPathComponent:@"Cog"];
basePath = [basePath stringByAppendingPathComponent:@"Icons"];
basePath = [basePath stringByAppendingPathComponent:baseName];
return [basePath stringByAppendingPathExtension:@"png"];
}
- (BOOL)loadCustomDockIcons {
NSError *error = nil;
NSData *dataStopIcon = [NSData dataWithContentsOfFile:getCustomIconName(@"Stop") options:(NSDataReadingMappedIfSafe) error:&error];
if(!dataStopIcon || error) {
return NO;
}
NSData *dataPlayIcon = [NSData dataWithContentsOfFile:getCustomIconName(@"Play") options:(NSDataReadingMappedIfSafe) error:&error];
if(!dataPlayIcon || error) {
return NO;
}
NSData *dataPauseIcon = [NSData dataWithContentsOfFile:getCustomIconName(@"Pause") options:(NSDataReadingMappedIfSafe) error:&error];
if(!dataPauseIcon || error) {
return NO;
}
dockCustomStop = [[NSImage alloc] initWithData:dataStopIcon];
dockCustomPlay = [[NSImage alloc] initWithData:dataPlayIcon];
dockCustomPause = [[NSImage alloc] initWithData:dataPauseIcon];
return (dockCustomStop && dockCustomPlay && dockCustomPause);
}
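For context, the loader above expects three user-supplied PNG files under Application Support, as resolved by getCustomIconName. A minimal sketch of that lookup, assuming the same Cog/Icons layout the code above uses (nothing here goes beyond what the function already does):

// Resolves to ~/Library/Application Support/Cog/Icons/<name>.png and reports presence.
NSString *appSupport = [NSSearchPathForDirectoriesInDomains(NSApplicationSupportDirectory,
                                                            NSUserDomainMask, YES) firstObject];
NSString *iconsDir = [[appSupport stringByAppendingPathComponent:@"Cog"]
                      stringByAppendingPathComponent:@"Icons"];
for(NSString *name in @[@"Stop", @"Play", @"Pause"]) {
	NSString *path = [[iconsDir stringByAppendingPathComponent:name]
	                  stringByAppendingPathExtension:@"png"];
	NSLog(@"%@ -> %@", path, [[NSFileManager defaultManager] fileExistsAtPath:path] ? @"found" : @"missing");
}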
- (void)refreshDockIcon:(NSInteger)playbackStatus withProgress:(double)progressStatus {
// Guard against a really weird crash a user experienced when the plaque image didn't load
if(!dockImage || dockImage.size.width == 0 || dockImage.size.height == 0) return;
@ -86,30 +50,6 @@ static NSString *getCustomIconName(NSString *baseName) {
BOOL drawIcon = NO;
BOOL removeProgress = NO;
BOOL useCustomDockIcons = [[NSUserDefaults standardUserDefaults] boolForKey:@"customDockIcons"];
BOOL useCustomDockIconsPlaque = [[NSUserDefaults standardUserDefaults] boolForKey:@"customDockIconsPlaque"];
if(useCustomDockIcons && !dockCustomLoaded) {
dockCustomLoaded = [self loadCustomDockIcons];
if(!dockCustomLoaded) {
useCustomDockIcons = NO;
}
}
if(useCustomDockIcons != lastDockCustom ||
useCustomDockIconsPlaque != lastDockCustomPlaque) {
lastDockCustom = useCustomDockIcons;
lastDockCustomPlaque = useCustomDockIconsPlaque;
drawIcon = YES;
if(!useCustomDockIcons) {
dockCustomLoaded = NO;
dockCustomStop = nil;
dockCustomPlay = nil;
dockCustomPause = nil;
}
}
if(playbackStatus < 0)
playbackStatus = lastPlaybackStatus;
else {
@ -142,20 +82,20 @@ static NSString *getCustomIconName(NSString *baseName) {
if(drawIcon) {
switch(playbackStatus) {
case CogStatusPlaying:
badgeImage = useCustomDockIcons ? dockCustomPlay : [NSImage imageNamed:getBadgeName(@"Play", colorfulIcons)];
badgeImage = [NSImage imageNamed:getBadgeName(@"Play", colorfulIcons)];
break;
case CogStatusPaused:
badgeImage = useCustomDockIcons ? dockCustomPause : [NSImage imageNamed:getBadgeName(@"Pause", colorfulIcons)];
badgeImage = [NSImage imageNamed:getBadgeName(@"Pause", colorfulIcons)];
break;
default:
badgeImage = useCustomDockIcons ? dockCustomStop : [NSImage imageNamed:getBadgeName(@"Stop", colorfulIcons)];
badgeImage = [NSImage imageNamed:getBadgeName(@"Stop", colorfulIcons)];
break;
}
NSSize badgeSize = [badgeImage size];
NSImage *newDockImage = (useCustomDockIcons && !useCustomDockIconsPlaque) ? [[NSImage alloc] initWithSize:NSMakeSize(1024, 1024)] : [dockImage copy];
NSImage *newDockImage = [dockImage copy];
[newDockImage lockFocus];
[badgeImage drawInRect:NSMakeRect(0, 0, 1024, 1024)
@ -170,7 +110,7 @@ static NSString *getCustomIconName(NSString *baseName) {
[dockTile setContentView:imageView];
progressIndicator = [[NSProgressIndicator alloc] initWithFrame:NSMakeRect(0.0, 0.0, dockTile.size.width, 10.0)];
[progressIndicator setStyle:NSProgressIndicatorStyleBar];
[progressIndicator setStyle:NSProgressIndicatorBarStyle];
[progressIndicator setIndeterminate:NO];
[progressIndicator setBezeled:YES];
[progressIndicator setMinValue:0];
@ -246,9 +186,7 @@ static NSString *getCustomIconName(NSString *baseName) {
}
[self refreshDockIcon:-1 withProgress:progressStatus];
} else if([keyPath isEqualToString:@"values.colorfulDockIcons"] ||
[keyPath isEqualToString:@"values.customDockIcons"] ||
[keyPath isEqualToString:@"values.customDockIconsPlaque"]) {
} else if([keyPath isEqualToString:@"values.colorfulDockIcons"]) {
[self refreshDockIcon:-1 withProgress:-10];
} else if([keyPath isEqualToString:@"fractionCompleted"]) {
double progressStatus = [(NSProgress *)object fractionCompleted];
@ -259,12 +197,6 @@ static NSString *getCustomIconName(NSString *baseName) {
}
}
- (void)refreshDockIcons:(NSNotification *)notification {
lastDockCustom = NO;
dockCustomLoaded = NO;
[self refreshDockIcon:-1 withProgress:-10];
}
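The observer and notification paths above refresh the dock icon with sentinel arguments rather than live values. A short illustration of the convention as it reads in this file; the meaning of -10 is inferred from its use as a "no position" sentinel elsewhere in this changeset, so treat that interpretation as an assumption:

// -1 status: keep the last known playback status; -10 progress: nothing meaningful to draw.
[self refreshDockIcon:-1 withProgress:-10];
// A real status and progress fraction can still be passed directly:
[self refreshDockIcon:CogStatusPlaying withProgress:0.5];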
- (void)awakeFromNib {
dockImage = [[NSImage imageNamed:@"Plaque"] copy];
lastColorfulStatus = -1;

View file

@ -19,16 +19,10 @@
#define DEFAULT_VOLUME_DOWN 5
#define DEFAULT_VOLUME_UP DEFAULT_VOLUME_DOWN
#define DEFAULT_PITCH_DOWN 0.2
#define DEFAULT_PITCH_UP DEFAULT_PITCH_DOWN
#define DEFAULT_TEMPO_DOWN 0.2
#define DEFAULT_TEMPO_UP DEFAULT_TEMPO_DOWN
extern NSString *CogPlaybackDidBeginNotificiation;
extern NSString *CogPlaybackDidPauseNotificiation;
extern NSString *CogPlaybackDidResumeNotificiation;
extern NSString *CogPlaybackDidStopNotificiation;
extern NSString *CogPlaybackDidBeginNotficiation;
extern NSString *CogPlaybackDidPauseNotficiation;
extern NSString *CogPlaybackDidResumeNotficiation;
extern NSString *CogPlaybackDidStopNotficiation;
extern NSDictionary *makeRGInfo(PlaylistEntry *pe);
@ -46,9 +40,6 @@ extern NSDictionary *makeRGInfo(PlaylistEntry *pe);
IBOutlet EqualizerWindowController *equalizerWindowController;
IBOutlet NSSlider *volumeSlider;
IBOutlet NSSlider *pitchSlider;
IBOutlet NSSlider *tempoSlider;
IBOutlet NSButton *lockButton;
IBOutlet NSArrayController *outputDevices;
@ -78,14 +69,6 @@ extern NSDictionary *makeRGInfo(PlaylistEntry *pe);
- (IBAction)volumeDown:(id)sender;
- (IBAction)volumeUp:(id)sender;
- (IBAction)changePitch:(id)sender;
- (IBAction)pitchDown:(id)sender;
- (IBAction)pitchUp:(id)sender;
- (IBAction)changeTempo:(id)sender;
- (IBAction)tempoDown:(id)sender;
- (IBAction)tempoUp:(id)sender;
- (IBAction)playPauseResume:(id)sender;
- (IBAction)pauseResume:(id)sender;
- (IBAction)skipToNextAlbum:(id)sender;

View file

@ -19,76 +19,18 @@
#import "Logging.h"
@import Sentry;
// Sentry captureMessage is too spammy to use for anything but actual errors
@import Firebase;
extern BOOL kAppControllerShuttingDown;
@implementation NSObject (NxAdditions)
#if 0
-(void)performSelectorInBackground:(SEL)selector withObjects:(id)object, ...
{
NSMethodSignature *signature = [self methodSignatureForSelector:selector];
// Setup the invocation
NSInvocation *invocation = [NSInvocation invocationWithMethodSignature:signature];
invocation.target = self;
invocation.selector = selector;
// Associate the arguments
va_list objects;
va_start(objects, object);
unsigned int objectCounter = 2;
for (id obj = object; obj != nil; obj = va_arg(objects, id))
{
[invocation setArgument:&obj atIndex:objectCounter++];
}
va_end(objects);
// Make sure to invoke on a background queue
NSInvocationOperation *operation = [[NSInvocationOperation alloc] initWithInvocation:invocation];
NSOperationQueue *backgroundQueue = [[NSOperationQueue alloc] init];
[backgroundQueue addOperation:operation];
}
#endif
-(void)performSelectorOnMainThread:(SEL)selector withObjects:(id)object, ...
{
NSMethodSignature *signature = [self methodSignatureForSelector:selector];
// Setup the invocation
NSInvocation *invocation = [NSInvocation invocationWithMethodSignature:signature];
invocation.target = self;
invocation.selector = selector;
// Associate the arguments
va_list objects;
va_start(objects, object);
unsigned int objectCounter = 2;
for (id obj = object; obj != nil; obj = va_arg(objects, id))
{
[invocation setArgument:&obj atIndex:objectCounter++];
}
va_end(objects);
// Invoke on the main operation queue
NSInvocationOperation *operation = [[NSInvocationOperation alloc] initWithInvocation:invocation];
NSOperationQueue *mainQueue = [NSOperationQueue mainQueue];
[mainQueue addOperation:operation];
}
@end
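The NxAdditions category above forwards a selector plus a nil-terminated list of object arguments to the main operation queue via NSInvocation. A minimal usage sketch; the selector name is hypothetical and only illustrates the calling convention (arguments start at invocation index 2, scalars must be boxed, and the list must end with nil):

// Hypothetical two-argument selector dispatched onto the main queue.
[self performSelectorOnMainThread:@selector(updateTitle:withCount:)
                      withObjects:@"Now Playing", @(42), nil];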
@implementation PlaybackController
#define DEFAULT_SEEK 5
NSString *CogPlaybackDidBeginNotificiation = @"CogPlaybackDidBeginNotificiation";
NSString *CogPlaybackDidPauseNotificiation = @"CogPlaybackDidPauseNotificiation";
NSString *CogPlaybackDidResumeNotificiation = @"CogPlaybackDidResumeNotificiation";
NSString *CogPlaybackDidStopNotificiation = @"CogPlaybackDidStopNotificiation";
NSString *CogPlaybackDidBeginNotficiation = @"CogPlaybackDidBeginNotficiation";
NSString *CogPlaybackDidPauseNotficiation = @"CogPlaybackDidPauseNotficiation";
NSString *CogPlaybackDidResumeNotficiation = @"CogPlaybackDidResumeNotficiation";
NSString *CogPlaybackDidStopNotficiation = @"CogPlaybackDidStopNotficiation";
@synthesize playbackStatus;
@ -120,51 +62,15 @@ NSString *CogPlaybackDidStopNotificiation = @"CogPlaybackDidStopNotificiation";
- (void)initDefaults {
NSDictionary *defaultsDictionary = @{ @"volume": @(75.0),
@"pitch": @(1.0),
@"tempo": @(1.0),
@"speedLock": @(YES),
@"GraphicEQenable": @(NO),
@"GraphicEQpreset": @(-1),
@"GraphicEQtrackgenre": @(NO),
@"volumeLimit": @(YES),
@"enableHrtf": @(NO),
@"enableHeadTracking": @(NO),
@"enableHDCD": @(NO),
/*@"rubberbandEngine": @"faster",*/
@"rubberbandTransients": @"crisp",
@"rubberbandDetector": @"compound",
@"rubberbandPhase": @"laminar",
@"rubberbandWindow": @"standard",
@"rubberbandSmoothing": @"off",
@"rubberbandFormant": @"shifted",
@"rubberbandPitch": @"highspeed",
@"rubberbandChannels": @"apart"
};
@"headphoneVirtualization": @(NO) };
[[NSUserDefaults standardUserDefaults] registerDefaults:defaultsDictionary];
}
static double speedScale(double input, double min, double max) {
input = (input - min) * 100.0 / (max - min);
return ((input * input) * (5.0 - 0.2) / 10000.0) + 0.2;
}
static double reverseSpeedScale(double input, double min, double max) {
input = sqrtf((input - 0.2) * 10000.0 / (5.0 - 0.2));
return (input * (max - min) / 100.0) + min;
}
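speedScale and reverseSpeedScale map a slider position onto a playback speed in the 0.2x-5.0x range with a square-law curve, and invert that mapping when restoring slider positions from defaults. A worked check, assuming a slider whose minValue/maxValue are 0 and 100 (the real bounds live in the nib and are not shown in this diff):

// speedScale(0,   0, 100) = 0.2   (slider at minimum -> 0.2x)
// speedScale(100, 0, 100) = 5.0   (slider at maximum -> 5.0x)
// speedScale(50,  0, 100) = 2500 * 4.8 / 10000 + 0.2 = 1.4, not the linear midpoint 2.6
double mid  = speedScale(50.0, 0.0, 100.0);        // 1.4
double back = reverseSpeedScale(mid, 0.0, 100.0);  // ~50.0, recovering the slider position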
- (void)snapSpeeds {
double pitch = [[NSUserDefaults standardUserDefaults] doubleForKey:@"pitch"];
double tempo = [[NSUserDefaults standardUserDefaults] doubleForKey:@"tempo"];
if(fabs(pitch - 1.0) < 1e-6) {
[[NSUserDefaults standardUserDefaults] setDouble:1.0 forKey:@"pitch"];
}
if(fabs(tempo - 1.0) < 1e-6) {
[[NSUserDefaults standardUserDefaults] setDouble:1.0 forKey:@"tempo"];
}
}
- (void)awakeFromNib {
BOOL volumeLimit = [[[NSUserDefaultsController sharedUserDefaultsController] defaults] boolForKey:@"volumeLimit"];
const double MAX_VOLUME = (volumeLimit) ? 100.0 : 800.0;
@ -174,16 +80,6 @@ static double reverseSpeedScale(double input, double min, double max) {
[volumeSlider setDoubleValue:logarithmicToLinear(volume, MAX_VOLUME)];
[audioPlayer setVolume:volume];
double pitch = [[NSUserDefaults standardUserDefaults] doubleForKey:@"pitch"];
[pitchSlider setDoubleValue:reverseSpeedScale(pitch, [pitchSlider minValue], [pitchSlider maxValue])];
double tempo = [[NSUserDefaults standardUserDefaults] doubleForKey:@"tempo"];
[tempoSlider setDoubleValue:reverseSpeedScale(tempo, [tempoSlider minValue], [tempoSlider maxValue])];
[self snapSpeeds];
BOOL speedLock = [[NSUserDefaults standardUserDefaults] boolForKey:@"speedLock"];
[lockButton setTitle:speedLock ? @"🔒" : @"🔓"];
[self setSeekable:NO];
}
@ -205,11 +101,6 @@ static double reverseSpeedScale(double input, double min, double max) {
}
- (IBAction)pause:(id)sender {
if(![self seekable]) {
[self stop:sender];
return;
}
[[NSUserDefaults standardUserDefaults] setInteger:CogStatusPaused forKey:@"lastPlaybackStatus"];
[audioPlayer pause];
@ -301,11 +192,11 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
if(!pe.url) {
pe.error = YES;
pe.errorMessage = NSLocalizedStringFromTableInBundle(@"ErrorMessageBadFile", nil, [NSBundle bundleForClass:[self class]], @"");
[SentrySDK captureMessage:@"Attempted to play a bad file with no URL"];
[[FIRCrashlytics crashlytics] log:@"Attempting to play bad file."];
return;
}
//[SentrySDK captureMessage:[NSString stringWithFormat:@"Playing track: %@", pe.url]];
[[FIRCrashlytics crashlytics] logWithFormat:@"Playing track: %@", pe.url];
DLog(@"PLAYLIST CONTROLLER: %@", [playlistController class]);
[playlistController setCurrentEntry:pe];
@ -325,7 +216,7 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
#if 0
// Race here, but the worst that could happen is we re-read the data
if([pe metadataLoaded] != YES) {
if ([pe metadataLoaded] != YES) {
[pe performSelectorOnMainThread:@selector(setMetadata:) withObject:[playlistLoader readEntryInfo:pe] waitUntilDone:YES];
}
#elif 0
@ -346,7 +237,7 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
double seekTime = pe.seekable ? [offset doubleValue] : 0.0;
[audioPlayer performSelectorOnMainThread:@selector(playBG:withUserInfo:withRGInfo:startPaused:andSeekTo:) withObjects:pe.url, pe, makeRGInfo(pe), @(paused), @(seekTime), nil];
[audioPlayer play:pe.url withUserInfo:pe withRGInfo:makeRGInfo(pe) startPaused:paused andSeekTo:seekTime];
}
- (IBAction)next:(id)sender {
@ -357,12 +248,8 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
}
- (IBAction)prev:(id)sender {
double pos = [audioPlayer amountPlayed];
if(pos < 5.0) {
if([playlistController prev] == NO)
return;
}
[self playEntry:[playlistController currentEntry]];
}
@ -381,7 +268,7 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
- (IBAction)seek:(id)sender {
double time = [sender doubleValue];
[audioPlayer performSelectorOnMainThread:@selector(seekToTimeBG:) withObjects:@(time), nil];
[audioPlayer seekToTime:time];
lastPosition = -10;
@ -398,7 +285,7 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
lastPosition = -10;
[audioPlayer performSelectorOnMainThread:@selector(seekToTimeBG:) withObjects:@(time), nil];
[audioPlayer seekToTime:time];
[self setPosition:time];
@ -424,17 +311,15 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
}
- (void)seekForward:(double)amount {
if([self seekable]) {
double seekTo = [audioPlayer amountPlayed] + amount;
if(seekTo > [[[playlistController currentEntry] length] doubleValue]) {
[self next:self];
} else {
lastPosition = -10;
[audioPlayer performSelectorOnMainThread:@selector(seekToTimeBG:) withObjects:@(seekTo), nil];
[audioPlayer seekToTime:seekTo];
[self setPosition:seekTo];
}
}
}
- (IBAction)eventSeekBackward:(id)sender {
@ -442,7 +327,6 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
}
- (void)seekBackward:(double)amount {
if([self seekable]) {
double seekTo = [audioPlayer amountPlayed] - amount;
if(seekTo < 0)
@ -450,9 +334,8 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
lastPosition = -10;
[audioPlayer performSelectorOnMainThread:@selector(seekToTimeBG:) withObjects:@(seekTo), nil];
[audioPlayer seekToTime:seekTo];
[self setPosition:seekTo];
}
}
/*
@ -461,7 +344,7 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
NSImage *img = [NSImage imageNamed:name];
// [img retain];
if(img == nil)
if (img == nil)
{
DLog(@"Error loading image!");
}
@ -562,32 +445,6 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
}
}
- (IBAction)changePitch:(id)sender {
const double pitch = speedScale([sender doubleValue], [pitchSlider minValue], [pitchSlider maxValue]);
DLog(@"PITCH: %lf", pitch);
[[NSUserDefaults standardUserDefaults] setDouble:pitch forKey:@"pitch"];
if([[NSUserDefaults standardUserDefaults] boolForKey:@"speedLock"]) {
[[NSUserDefaults standardUserDefaults] setDouble:pitch forKey:@"tempo"];
}
[self snapSpeeds];
}
- (IBAction)changeTempo:(id)sender {
const double tempo = speedScale([sender doubleValue], [tempoSlider minValue], [tempoSlider maxValue]);
DLog(@"TEMPO: %lf", tempo);
[[NSUserDefaults standardUserDefaults] setDouble:tempo forKey:@"tempo"];
if([[NSUserDefaults standardUserDefaults] boolForKey:@"speedLock"]) {
[[NSUserDefaults standardUserDefaults] setDouble:tempo forKey:@"pitch"];
}
[self snapSpeeds];
}
- (IBAction)skipToNextAlbum:(id)sender {
BOOL found = NO;
@ -689,75 +546,8 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
[[NSUserDefaults standardUserDefaults] setDouble:[audioPlayer volume] forKey:@"volume"];
}
- (IBAction)pitchDown:(id)sender {
double pitch = speedScale([pitchSlider doubleValue], [pitchSlider minValue], [pitchSlider maxValue]);
double newPitch = pitch - DEFAULT_PITCH_DOWN;
if(newPitch < 0.2) {
newPitch = 0.2;
}
[pitchSlider setDoubleValue:reverseSpeedScale(newPitch, [pitchSlider minValue], [pitchSlider maxValue])];
[[NSUserDefaults standardUserDefaults] setDouble:newPitch forKey:@"pitch"];
if([[NSUserDefaults standardUserDefaults] boolForKey:@"speedLock"]) {
[tempoSlider setDoubleValue:reverseSpeedScale(newPitch, [tempoSlider minValue], [tempoSlider maxValue])];
[[NSUserDefaults standardUserDefaults] setDouble:newPitch forKey:@"tempo"];
}
}
- (IBAction)pitchUp:(id)sender {
double pitch = speedScale([pitchSlider doubleValue], [pitchSlider minValue], [pitchSlider maxValue]);
double newPitch = pitch + DEFAULT_PITCH_UP;
if(newPitch > 5.0) {
newPitch = 5.0;
}
[pitchSlider setDoubleValue:reverseSpeedScale(newPitch, [pitchSlider minValue], [pitchSlider maxValue])];
[[NSUserDefaults standardUserDefaults] setDouble:newPitch forKey:@"pitch"];
if([[NSUserDefaults standardUserDefaults] boolForKey:@"speedLock"]) {
[tempoSlider setDoubleValue:reverseSpeedScale(newPitch, [tempoSlider minValue], [tempoSlider maxValue])];
[[NSUserDefaults standardUserDefaults] setDouble:newPitch forKey:@"tempo"];
}
}
- (IBAction)tempoDown:(id)sender {
double tempo = speedScale([tempoSlider doubleValue], [tempoSlider minValue], [tempoSlider maxValue]);
double newTempo = tempo - DEFAULT_TEMPO_DOWN;
if(newTempo < 0.2) {
newTempo = 0.2;
}
[tempoSlider setDoubleValue:reverseSpeedScale(newTempo, [tempoSlider minValue], [tempoSlider maxValue])];
[[NSUserDefaults standardUserDefaults] setDouble:newTempo forKey:@"tempo"];
if([[NSUserDefaults standardUserDefaults] boolForKey:@"speedLock"]) {
[pitchSlider setDoubleValue:reverseSpeedScale(newTempo, [pitchSlider minValue], [pitchSlider maxValue])];
[[NSUserDefaults standardUserDefaults] setDouble:newTempo forKey:@"pitch"];
}
}
- (IBAction)tempoUp:(id)sender {
double tempo = speedScale([tempoSlider doubleValue], [tempoSlider minValue], [tempoSlider maxValue]);
double newTempo = tempo + DEFAULT_TEMPO_UP;
if(newTempo > 5.0) {
newTempo = 5.0;
}
[tempoSlider setDoubleValue:reverseSpeedScale(newTempo, [tempoSlider minValue], [tempoSlider maxValue])];
[[NSUserDefaults standardUserDefaults] setDouble:newTempo forKey:@"tempo"];
if([[NSUserDefaults standardUserDefaults] boolForKey:@"speedLock"]) {
[pitchSlider setDoubleValue:reverseSpeedScale(newTempo, [pitchSlider minValue], [pitchSlider maxValue])];
[[NSUserDefaults standardUserDefaults] setDouble:newTempo forKey:@"pitch"];
}
}
- (void)audioPlayer:(AudioPlayer *)player displayEqualizer:(AudioUnit)eq {
if(_eq && _eq != eq) {
[equalizerWindowController setEQ:nil];
}
@ -775,6 +565,19 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
- (void)audioPlayer:(AudioPlayer *)player removeEqualizer:(AudioUnit)eq {
if(eq == _eq) {
OSStatus err;
CFPropertyListRef classData;
UInt32 size;
size = sizeof(classData);
err = AudioUnitGetProperty(eq, kAudioUnitProperty_ClassInfo, kAudioUnitScope_Global, 0, &classData, &size);
if(err == noErr) {
CFPreferencesSetAppValue(CFSTR("GraphEQ_Preset"), classData, kCFPreferencesCurrentApplication);
CFRelease(classData);
}
CFPreferencesAppSynchronize(kCFPreferencesCurrentApplication);
[equalizerWindowController setEQ:nil];
_eq = nil;
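The hunk above snapshots the equalizer's state by reading kAudioUnitProperty_ClassInfo and storing the property list in the app's preferences under GraphEQ_Preset. A hedged sketch of the matching restore direction, which is not part of this diff (bare AudioUnit calls only; any preset-changed notification handling is omitted):

// Restore a previously saved preset into an equalizer AudioUnit.
CFPropertyListRef classData = CFPreferencesCopyAppValue(CFSTR("GraphEQ_Preset"),
                                                        kCFPreferencesCurrentApplication);
if(classData) {
	AudioUnitSetProperty(eq, kAudioUnitProperty_ClassInfo, kAudioUnitScope_Global, 0,
	                     &classData, sizeof(classData));
	CFRelease(classData);
}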
@ -796,15 +599,15 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
}
if(pe && pe.url) {
//[SentrySDK captureMessage:[NSString stringWithFormat:@"Beginning decoding track: %@", pe.url]];
[[FIRCrashlytics crashlytics] logWithFormat:@"Beginning decoding track: %@", pe.url];
[player setNextStream:pe.url withUserInfo:pe withRGInfo:makeRGInfo(pe)];
} else if(pe) {
[SentrySDK captureMessage:@"Invalid playlist entry reached"];
[[FIRCrashlytics crashlytics] log:@"Invalid playlist entry reached."];
[player setNextStream:nil];
pe.error = YES;
pe.errorMessage = NSLocalizedStringFromTableInBundle(@"ErrorMessageBadFile", nil, [NSBundle bundleForClass:[self class]], @"");
} else {
//[SentrySDK captureMessage:@"End of playlist reached"];
[[FIRCrashlytics crashlytics] log:@"End of playlist reached."];
[player setNextStream:nil];
}
}
@ -815,7 +618,7 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
// Delay the action until this function has returned to the audio thread
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 5 * NSEC_PER_MSEC), dispatch_get_main_queue(), ^{
if(pe) {
//[SentrySDK captureMessage:[NSString stringWithFormat:@"Updating UI with track: %@", pe.url]];
[[FIRCrashlytics crashlytics] logWithFormat:@"Updating UI with track: %@", pe.url];
}
[self->playlistController setCurrentEntry:pe];
@ -832,7 +635,7 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
});
if(pe) {
[[NSNotificationCenter defaultCenter] postNotificationName:CogPlaybackDidBeginNotificiation object:pe];
[[NSNotificationCenter defaultCenter] postNotificationName:CogPlaybackDidBeginNotficiation object:pe];
}
}
@ -846,26 +649,26 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
}
if(status == CogStatusStopped) {
//[SentrySDK captureMessage:@"Playback stopped"];
[[FIRCrashlytics crashlytics] log:@"Stopped."];
[self setPosition:0];
[self setSeekable:NO]; // the player stopped, disable the slider
[[NSNotificationCenter defaultCenter] postNotificationName:CogPlaybackDidStopNotificiation object:nil];
[[NSNotificationCenter defaultCenter] postNotificationName:CogPlaybackDidStopNotficiation object:nil];
} else // paused
{
//[SentrySDK captureMessage:@"Playback paused"];
[[NSNotificationCenter defaultCenter] postNotificationName:CogPlaybackDidPauseNotificiation object:nil];
[[FIRCrashlytics crashlytics] log:@"Paused."];
[[NSNotificationCenter defaultCenter] postNotificationName:CogPlaybackDidPauseNotficiation object:nil];
}
} else if(status == CogStatusPlaying) {
//[SentrySDK captureMessage:@"Playback started"];
[[FIRCrashlytics crashlytics] log:@"Started playing."];
if(!positionTimer) {
positionTimer = [NSTimer timerWithTimeInterval:0.2 target:self selector:@selector(updatePosition:) userInfo:nil repeats:YES];
[[NSRunLoop currentRunLoop] addTimer:positionTimer forMode:NSRunLoopCommonModes];
}
[[NSNotificationCenter defaultCenter] postNotificationName:CogPlaybackDidResumeNotificiation object:nil];
[[NSNotificationCenter defaultCenter] postNotificationName:CogPlaybackDidResumeNotficiation object:nil];
}
if(status == CogStatusStopped) {
@ -887,6 +690,14 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
break;
}
if(status == CogStatusStopped) {
status = CogStatusStopping;
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 3 * NSEC_PER_SEC), dispatch_get_main_queue(), ^{
if([self playbackStatus] == CogStatusStopping)
[self setPlaybackStatus:CogStatusStopped];
});
}
[self setPlaybackStatus:status];
// If we don't send it here when we've stopped, the NPIC will be stuck at the last file we played.
[self sendMetaData];
@ -894,7 +705,7 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
- (void)audioPlayer:(AudioPlayer *)player didStopNaturally:(id)userInfo {
if([[NSUserDefaults standardUserDefaults] boolForKey:@"quitOnNaturalStop"]) {
//[SentrySDK captureMessage:@"Playback stopped naturally, terminating app"];
[[FIRCrashlytics crashlytics] log:@"Terminating due to natural stop."];
[NSApp terminate:nil];
}
}
@ -909,21 +720,20 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
- (void)audioPlayer:(AudioPlayer *)player restartPlaybackAtCurrentPosition:(id)userInfo {
PlaylistEntry *pe = [playlistController currentEntry];
BOOL paused = playbackStatus == CogStatusPaused;
//[SentrySDK captureMessage:[NSString stringWithFormat:@"Playback restarting for track: %@", pe.url]];
[player performSelectorOnMainThread:@selector(playBG:withUserInfo:withRGInfo:startPaused:andSeekTo:) withObjects:pe.url, pe, makeRGInfo(pe), @(paused), @(pe.seekable ? pe.currentPosition : 0.0), nil];
[[FIRCrashlytics crashlytics] logWithFormat:@"Restarting playback of track: %@", pe.url];
[player play:pe.url withUserInfo:pe withRGInfo:makeRGInfo(pe) startPaused:paused andSeekTo:pe.seekable ? pe.currentPosition : 0.0];
}
- (void)audioPlayer:(AudioPlayer *)player pushInfo:(NSDictionary *)info toTrack:(id)userInfo {
PlaylistEntry *pe = (PlaylistEntry *)userInfo;
if(!pe) pe = [playlistController currentEntry];
if (!pe) pe = [playlistController currentEntry];
[pe setMetadata:info];
[playlistView refreshTrack:pe];
// Delay the action until this function has returned to the audio thread
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 50 * NSEC_PER_MSEC), dispatch_get_main_queue(), ^{
self->playlistController.currentEntry = pe;
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 5 * NSEC_PER_MSEC), dispatch_get_main_queue(), ^{
[self sendMetaData];
[[NSNotificationCenter defaultCenter] postNotificationName:CogPlaybackDidBeginNotificiation object:pe];
});
[[NSNotificationCenter defaultCenter] postNotificationName:CogPlaybackDidBeginNotficiation object:pe];
}
- (void)audioPlayer:(AudioPlayer *)player reportPlayCountForTrack:(id)userInfo {
@ -933,15 +743,6 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
}
}
- (void)audioPlayer:(AudioPlayer *)player updatePosition:(id)userInfo {
if(userInfo) {
PlaylistEntry *pe = (PlaylistEntry *)userInfo;
if([pe current]) {
[self updatePosition:userInfo];
}
}
}
- (void)audioPlayer:(AudioPlayer *)player setError:(NSNumber *)status toTrack:(id)userInfo {
PlaylistEntry *pe = (PlaylistEntry *)userInfo;
[pe setError:[status boolValue]];
@ -1021,11 +822,9 @@ NSDictionary *makeRGInfo(PlaylistEntry *pe) {
if(entry.year) {
// If PlaylistEntry can ever represent a full date, as some tag formats do, change this
NSCalendar *calendar = [NSCalendar currentCalendar];
NSDate *releaseYear = [calendar dateWithEra:1 year:entry.year month:1 day:1 hour:0 minute:0 second:0 nanosecond:0];
if(releaseYear) {
NSDate *releaseYear = [calendar dateWithEra:1 year:entry.year month:0 day:0 hour:0 minute:0 second:0 nanosecond:0];
[songInfo setObject:releaseYear forKey:MPMediaItemPropertyReleaseDate];
}
}
[songInfo setObject:@(entry.currentPosition) forKey:MPNowPlayingInfoPropertyElapsedPlaybackTime];
[songInfo setObject:entry.length forKey:MPMediaItemPropertyPlaybackDuration];
[songInfo setObject:@(entry.index) forKey:MPMediaItemPropertyPersistentID];

View file

@ -322,19 +322,19 @@ didReceiveNotificationResponse:(UNNotificationResponse *)response
- (void)awakeFromNib {
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playbackDidBegin:)
name:CogPlaybackDidBeginNotificiation
name:CogPlaybackDidBeginNotficiation
object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playbackDidPause:)
name:CogPlaybackDidPauseNotificiation
name:CogPlaybackDidPauseNotficiation
object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playbackDidResume:)
name:CogPlaybackDidResumeNotificiation
name:CogPlaybackDidResumeNotficiation
object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playbackDidStop:)
name:CogPlaybackDidStopNotificiation
name:CogPlaybackDidStopNotficiation
object:nil];
}

View file

@ -8,7 +8,7 @@
#import <Cocoa/Cocoa.h>
#import <CogAudio/Plugin.h>
#import "Plugin.h"
@interface AudioDecoder : NSObject {
}

View file

@ -8,7 +8,7 @@
#import <Cocoa/Cocoa.h>
#import <CogAudio/CogSemaphore.h>
#import <CogAudio/Semaphore.h>
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
@ -26,8 +26,6 @@
OutputNode *output;
double volume;
double pitch;
double tempo;
NSMutableArray *chainQueue;
@ -63,14 +61,12 @@
- (void)play:(NSURL *)url withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi;
- (void)play:(NSURL *)url withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi startPaused:(BOOL)paused;
- (void)play:(NSURL *)url withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi startPaused:(BOOL)paused andSeekTo:(double)time;
- (void)playBG:(NSURL *)url withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi startPaused:(NSNumber *)paused andSeekTo:(NSNumber *)time;
- (void)stop;
- (void)pause;
- (void)resume;
- (void)seekToTime:(double)time;
- (void)seekToTimeBG:(NSNumber *)time;
- (void)setVolume:(double)v;
- (double)volume;
- (double)volumeUp:(double)amount;
@ -138,6 +134,5 @@
- (void)audioPlayer:(AudioPlayer *)player restartPlaybackAtCurrentPosition:(id)userInfo;
- (void)audioPlayer:(AudioPlayer *)player pushInfo:(NSDictionary *)info toTrack:(id)userInfo;
- (void)audioPlayer:(AudioPlayer *)player reportPlayCountForTrack:(id)userInfo;
- (void)audioPlayer:(AudioPlayer *)player updatePosition:(id)userInfo;
- (void)audioPlayer:(AudioPlayer *)player setError:(NSNumber *)status toTrack:(id)userInfo;
@end

View file

@ -25,10 +25,6 @@
outputLaunched = NO;
endOfInputReached = NO;
// Safety
pitch = 1.0;
tempo = 1.0;
chainQueue = [[NSMutableArray alloc] init];
semaphore = [[Semaphore alloc] init];
@ -60,27 +56,18 @@
[self play:url withUserInfo:userInfo withRGInfo:rgi startPaused:paused andSeekTo:0.0];
}
- (void)playBG:(NSURL *)url withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi startPaused:(NSNumber *)paused andSeekTo:(NSNumber *)time {
@synchronized (self) {
[self play:url withUserInfo:userInfo withRGInfo:rgi startPaused:[paused boolValue] andSeekTo:[time doubleValue]];
}
}
- (void)play:(NSURL *)url withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi startPaused:(BOOL)paused andSeekTo:(double)time {
[self play:url withUserInfo:userInfo withRGInfo:rgi startPaused:paused andSeekTo:time andResumeInterval:NO];
}
- (void)play:(NSURL *)url withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi startPaused:(BOOL)paused andSeekTo:(double)time andResumeInterval:(BOOL)resumeInterval {
ALog(@"Opening file for playback: %@ at seek offset %f%@", url, time, (paused) ? @", starting paused" : @"");
[self waitUntilCallbacksExit];
if(output) {
[output fadeOutBackground];
[output setShouldContinue:NO];
[output close];
}
if(!output) {
output = [[OutputNode alloc] initWithController:self previous:nil];
[output setupWithInterval:resumeInterval];
}
[output setup];
[output setVolume:volume];
@synchronized(chainQueue) {
for(id anObject in chainQueue) {
@ -96,19 +83,13 @@
}
bufferChain = [[BufferChain alloc] initWithController:self];
if(!resumeInterval) {
[self notifyStreamChanged:userInfo];
}
while(![bufferChain open:url withOutputFormat:[output format] withUserInfo:userInfo withRGInfo:rgi]) {
while(![bufferChain open:url withUserInfo:userInfo withRGInfo:rgi]) {
bufferChain = nil;
[self requestNextStream:userInfo];
if([nextStream isEqualTo:url]) {
return;
}
url = nextStream;
if(url == nil) {
return;
@ -129,23 +110,15 @@
[self setShouldContinue:YES];
if(!resumeInterval) {
outputLaunched = NO;
}
startedPaused = paused;
initialBufferFilled = NO;
previousUserInfo = userInfo;
[bufferChain launchThreads];
if(paused) {
if(paused)
[self setPlaybackStatus:CogStatusPaused waitUntilDone:YES];
if(time > 0.0) {
[self updatePosition:userInfo];
}
} else if(resumeInterval) {
[output fadeIn];
}
}
- (void)stop {
@ -171,7 +144,7 @@
}
- (void)pause {
[output fadeOut];
[output pause];
[self setPlaybackStatus:CogStatusPaused waitUntilDone:YES];
}
@ -183,17 +156,15 @@
[self launchOutputThread];
}
[output fadeIn];
[output resume];
[self setPlaybackStatus:CogStatusPlaying waitUntilDone:YES];
}
- (void)seekToTimeBG:(NSNumber *)time {
[self seekToTime:[time doubleValue]];
}
- (void)seekToTime:(double)time {
if(endOfInputReached) {
// This is a dirty hack in case the playback has finished with the track
// that the user thinks they're seeking into
CogStatus status = (CogStatus)currentPlaybackStatus;
NSURL *url;
id userInfo;
@ -205,7 +176,14 @@
rgi = [bufferChain rgInfo];
}
[self play:url withUserInfo:userInfo withRGInfo:rgi startPaused:(status == CogStatusPaused) andSeekTo:time andResumeInterval:YES];
[self stop];
[self play:url withUserInfo:userInfo withRGInfo:rgi startPaused:(status == CogStatusPaused) andSeekTo:time];
} else {
// Still decoding the current file, safe to seek within it
[output seek:time];
[bufferChain seek:time];
}
}
- (void)setVolume:(double)v {
@ -251,10 +229,6 @@
[self sendDelegateMethod:@selector(audioPlayer:restartPlaybackAtCurrentPosition:) withObject:previousUserInfo waitUntilDone:NO];
}
- (void)updatePosition:(id)userInfo {
[self sendDelegateMethod:@selector(audioPlayer:updatePosition:) withObject:userInfo waitUntilDone:NO];
}
- (void)pushInfo:(NSDictionary *)info toTrack:(id)userInfo {
[self sendDelegateMethod:@selector(audioPlayer:pushInfo:toTrack:) withObject:info withObject:userInfo waitUntilDone:NO];
}
@ -337,8 +311,6 @@
// if there's already one at the head of chainQueue... r-r-right?
for(BufferChain *chain in chainQueue) {
if([chain isRunning]) {
if(output)
[output setShouldPlayOutBuffer:YES];
atomic_fetch_sub(&refCount, 1);
return YES;
}
@ -362,8 +334,6 @@
while(duration >= 30.0 && shouldContinue) {
[semaphore wait];
if(atomic_load_explicit(&resettingNow, memory_order_relaxed)) {
if(output)
[output setShouldPlayOutBuffer:YES];
atomic_fetch_sub(&refCount, 1);
return YES;
}
@ -383,24 +353,19 @@
[self requestNextStream:nextStreamUserInfo];
if(!nextStream) {
if(output)
[output setShouldPlayOutBuffer:YES];
atomic_fetch_sub(&refCount, 1);
return YES;
}
BufferChain *lastChain;
@synchronized(chainQueue) {
newChain = [[BufferChain alloc] initWithController:self];
endOfInputReached = YES;
lastChain = [chainQueue lastObject];
BufferChain *lastChain = [chainQueue lastObject];
if(lastChain == nil) {
lastChain = bufferChain;
}
}
BOOL pathsEqual = NO;
@ -410,36 +375,13 @@
if([unixPathNext isEqualToString:unixPathPrev])
pathsEqual = YES;
} else if(![nextStream isFileURL] && ![[lastChain streamURL] isFileURL]) {
@try {
NSURL *lastURL = [lastChain streamURL];
NSString *nextScheme = [nextStream scheme];
NSString *lastScheme = [lastURL scheme];
NSString *nextHost = [nextStream host];
NSString *lastHost = [lastURL host];
NSString *nextPath = [nextStream path];
NSString *lastPath = [lastURL path];
if(nextScheme && lastScheme && [nextScheme isEqualToString:lastScheme]) {
if((!nextHost && !lastHost) ||
(nextHost && lastHost && [nextHost isEqualToString:lastHost])) {
if(nextPath && lastPath && [nextPath isEqualToString:lastPath]) {
pathsEqual = YES;
}
}
}
}
@catch(NSException *e) {
DLog(@"Exception thrown checking file match: %@", e);
}
}
if(pathsEqual) {
if([lastChain setTrack:nextStream] && [newChain openWithInput:[lastChain inputNode] withOutputFormat:[output format] withUserInfo:nextStreamUserInfo withRGInfo:nextStreamRGInfo]) {
if(pathsEqual || ([[nextStream scheme] isEqualToString:[[lastChain streamURL] scheme]] && (([nextStream host] == nil && [[lastChain streamURL] host] == nil) || [[nextStream host] isEqualToString:[[lastChain streamURL] host]]) && [[nextStream path] isEqualToString:[[lastChain streamURL] path]])) {
if([lastChain setTrack:nextStream] && [newChain openWithInput:[lastChain inputNode] withUserInfo:nextStreamUserInfo withRGInfo:nextStreamRGInfo]) {
[newChain setStreamURL:nextStream];
@synchronized(chainQueue) {
[self addChainToQueue:newChain];
}
DLog(@"TRACK SET!!! %@", newChain);
// Keep on-playin
newChain = nil;
@ -451,13 +393,9 @@
lastChain = nil;
NSURL *url = nextStream;
while(shouldContinue && ![newChain open:url withOutputFormat:[output format] withUserInfo:nextStreamUserInfo withRGInfo:nextStreamRGInfo]) {
while(shouldContinue && ![newChain open:nextStream withUserInfo:nextStreamUserInfo withRGInfo:nextStreamRGInfo]) {
if(nextStream == nil) {
newChain = nil;
if(output)
[output setShouldPlayOutBuffer:YES];
atomic_fetch_sub(&refCount, 1);
return YES;
}
@ -465,22 +403,10 @@
newChain = nil;
[self requestNextStream:nextStreamUserInfo];
if([nextStream isEqualTo:url]) {
newChain = nil;
if(output)
[output setShouldPlayOutBuffer:YES];
atomic_fetch_sub(&refCount, 1);
return YES;
}
url = nextStream;
newChain = [[BufferChain alloc] initWithController:self];
}
@synchronized(chainQueue) {
[self addChainToQueue:newChain];
}
newChain = nil;
@ -494,9 +420,7 @@
// - self.nextStream == next playlist entry's URL
// - self.nextStreamUserInfo == next playlist entry
// - head of chainQueue is the buffer chain for the next entry (which has launched its threads already)
if(output)
[output setShouldPlayOutBuffer:YES];
}
atomic_fetch_sub(&refCount, 1);
return YES;
@ -518,7 +442,6 @@
break;
}
[bufferChain setShouldContinue:NO];
bufferChain = nil;
bufferChain = [chainQueue objectAtIndex:0];

View file

@ -65,17 +65,12 @@ enum {
AudioStreamBasicDescription format;
NSMutableData *chunkData;
uint32_t channelConfig;
double streamTimestamp;
double streamTimeRatio;
BOOL formatAssigned;
BOOL lossless;
BOOL hdcd;
}
@property AudioStreamBasicDescription format;
@property uint32_t channelConfig;
@property double streamTimestamp;
@property double streamTimeRatio;
@property BOOL lossless;
+ (uint32_t)guessChannelConfig:(uint32_t)channelCount;
@ -85,25 +80,16 @@ enum {
+ (uint32_t)findChannelIndex:(uint32_t)flag;
- (id)init;
- (id)initWithProperties:(NSDictionary *)properties;
- (void)assignSamples:(const void *_Nonnull)data frameCount:(size_t)count;
- (void)assignData:(NSData *)data;
- (void)assignSamples:(const void *)data frameCount:(size_t)count;
- (NSData *)removeSamples:(size_t)frameCount;
- (BOOL)isEmpty;
- (size_t)frameCount;
- (void)setFrameCount:(size_t)count; // For truncation only
- (double)duration;
- (double)durationRatioed;
- (BOOL)isHDCD;
- (void)setHDCD;
- (AudioChunk *)copy;
@end

View file

@ -7,8 +7,6 @@
#import "AudioChunk.h"
#import "CoreAudioUtils.h"
@implementation AudioChunk
- (id)init {
@ -18,40 +16,11 @@
chunkData = [[NSMutableData alloc] init];
formatAssigned = NO;
lossless = NO;
hdcd = NO;
streamTimestamp = 0.0;
streamTimeRatio = 1.0;
}
return self;
}
- (id)initWithProperties:(NSDictionary *)properties {
self = [super init];
if(self) {
chunkData = [[NSMutableData alloc] init];
[self setFormat:propertiesToASBD(properties)];
lossless = [[properties objectForKey:@"encoding"] isEqualToString:@"lossless"];
hdcd = NO;
streamTimestamp = 0.0;
streamTimeRatio = 1.0;
}
return self;
}
- (AudioChunk *)copy {
AudioChunk *outputChunk = [[AudioChunk alloc] init];
[outputChunk setFormat:format];
[outputChunk setChannelConfig:channelConfig];
if(hdcd) [outputChunk setHDCD];
[outputChunk setStreamTimestamp:streamTimestamp];
[outputChunk setStreamTimeRatio:streamTimeRatio];
[outputChunk assignData:chunkData];
return outputChunk;
}
static const uint32_t AudioChannelConfigTable[] = {
0,
AudioConfigMono,
@ -133,8 +102,6 @@ static const uint32_t AudioChannelConfigTable[] = {
}
@synthesize lossless;
@synthesize streamTimestamp;
@synthesize streamTimeRatio;
- (AudioStreamBasicDescription)format {
return format;
@ -159,30 +126,21 @@ static const uint32_t AudioChannelConfigTable[] = {
channelConfig = config;
}
- (void)assignSamples:(const void *_Nonnull)data frameCount:(size_t)count {
- (void)assignSamples:(const void *)data frameCount:(size_t)count {
if(formatAssigned) {
const size_t bytesPerPacket = format.mBytesPerPacket;
[chunkData appendBytes:data length:bytesPerPacket * count];
}
}
- (void)assignData:(NSData *)data {
[chunkData appendData:data];
}
- (NSData *)removeSamples:(size_t)frameCount {
if(formatAssigned) {
@autoreleasepool {
const double secondsDuration = (double)(frameCount) / format.mSampleRate;
const double DSDrate = (format.mBitsPerChannel == 1) ? 8.0 : 1.0;
const size_t bytesPerPacket = format.mBytesPerPacket;
const size_t byteCount = bytesPerPacket * frameCount;
NSData *ret = [chunkData subdataWithRange:NSMakeRange(0, byteCount)];
[chunkData replaceBytesInRange:NSMakeRange(0, byteCount) withBytes:NULL length:0];
streamTimestamp += secondsDuration * streamTimeRatio * DSDrate;
return ret;
}
}
return [NSData data];
}
@ -198,36 +156,13 @@ static const uint32_t AudioChannelConfigTable[] = {
return 0;
}
- (void)setFrameCount:(size_t)count {
if(formatAssigned) {
count *= format.mBytesPerPacket;
size_t currentLength = [chunkData length];
if(count < currentLength) {
[chunkData replaceBytesInRange:NSMakeRange(count, currentLength - count) withBytes:NULL length:0];
}
}
}
- (double)duration {
if(formatAssigned && [chunkData length]) {
if(formatAssigned) {
const size_t bytesPerPacket = format.mBytesPerPacket;
const double sampleRate = format.mSampleRate;
const double DSDrate = (format.mBitsPerChannel == 1) ? 8.0 : 1.0;
return ((double)([chunkData length] / bytesPerPacket) / sampleRate) * DSDrate;
return (double)([chunkData length] / bytesPerPacket) / sampleRate;
}
return 0.0;
}
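The duration method above counts whole packets and, for 1-bit (DSD) chunks, multiplies by 8 because each stored byte carries eight samples. A worked check under assumed DSD64 figures (2,822,400 Hz, one byte per packet; the numbers are illustrative, not taken from the source):

// 352,800 bytes of DSD64 at 1 byte per packet:
//   frames   = 352800 / 1
//   duration = (352800 / 2822400) * 8 = 1.0 second
// For PCM chunks mBitsPerChannel != 1, so DSDrate is 1.0 and the factor disappears.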
- (double)durationRatioed {
return [self duration] * streamTimeRatio;
}
- (BOOL)isHDCD {
return hdcd;
}
- (void)setHDCD {
hdcd = YES;
}
@end

View file

@ -8,9 +8,9 @@
#import <Cocoa/Cocoa.h>
#import <CogAudio/AudioPlayer.h>
#import <CogAudio/ConverterNode.h>
#import <CogAudio/InputNode.h>
#import "AudioPlayer.h"
#import "ConverterNode.h"
#import "InputNode.h"
@interface BufferChain : NSObject {
InputNode *inputNode;
@ -26,16 +26,15 @@
}
- (id)initWithController:(id)c;
- (BOOL)buildChain;
- (void)buildChain;
- (BOOL)open:(NSURL *)url withOutputFormat:(AudioStreamBasicDescription)outputFormat withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi;
- (BOOL)open:(NSURL *)url withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi;
// Used when changing tracks to reuse the same decoder
- (BOOL)openWithInput:(InputNode *)i withOutputFormat:(AudioStreamBasicDescription)outputFormat withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi;
- (BOOL)openWithInput:(InputNode *)i withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi;
// Used when resetting the decoder on seek
- (BOOL)openWithDecoder:(id<CogDecoder>)decoder
withOutputFormat:(AudioStreamBasicDescription)outputFormat
withUserInfo:(id)userInfo
withRGInfo:(NSDictionary *)rgi;

View file

@ -9,11 +9,8 @@
#import "BufferChain.h"
#import "AudioSource.h"
#import "CoreAudioUtils.h"
#import "DSPDownmixNode.h"
#import "OutputNode.h"
#import "AudioPlayer.h"
#import "Logging.h"
@implementation BufferChain
@ -33,32 +30,21 @@
return self;
}
- (BOOL)buildChain {
// Cut off output source
finalNode = nil;
// Tear them down in reverse
converterNode = nil;
- (void)buildChain {
inputNode = nil;
converterNode = nil;
inputNode = [[InputNode alloc] initWithController:self previous:nil];
if(!inputNode) return NO;
converterNode = [[ConverterNode alloc] initWithController:self previous:inputNode];
if(!converterNode) return NO;
finalNode = converterNode;
return YES;
}
- (BOOL)open:(NSURL *)url withOutputFormat:(AudioStreamBasicDescription)outputFormat withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi {
- (BOOL)open:(NSURL *)url withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi {
[self setStreamURL:url];
[self setUserInfo:userInfo];
if (![self buildChain]) {
DLog(@"Couldn't build processing chain...");
return NO;
}
[self buildChain];
id<CogSource> source = [AudioSource audioSourceForURL:url];
DLog(@"Opening: %@", url);
@ -73,9 +59,15 @@
if(![inputNode openWithSource:source])
return NO;
if(![self initConverter:outputFormat])
NSDictionary *properties = [inputNode properties];
AudioStreamBasicDescription inputFormat = [inputNode nodeFormat];
uint32_t inputChannelConfig = 0;
if([properties valueForKey:@"channelConfig"])
inputChannelConfig = [[properties valueForKey:@"channelConfig"] unsignedIntValue];
if(![converterNode setupWithInputFormat:inputFormat withInputConfig:inputChannelConfig isLossless:[[properties valueForKey:@"encoding"] isEqualToString:@"lossless"]])
return NO;
[self initDownmixer];
[self setRGInfo:rgi];
@ -84,20 +76,24 @@
return YES;
}
- (BOOL)openWithInput:(InputNode *)i withOutputFormat:(AudioStreamBasicDescription)outputFormat withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi {
- (BOOL)openWithInput:(InputNode *)i withUserInfo:(id)userInfo withRGInfo:(NSDictionary *)rgi {
DLog(@"New buffer chain!");
[self setUserInfo:userInfo];
if(![self buildChain]) {
DLog(@"Couldn't build processing chain...");
return NO;
}
[self buildChain];
if(![inputNode openWithDecoder:[i decoder]])
return NO;
if(![self initConverter:outputFormat])
NSDictionary *properties = [inputNode properties];
AudioStreamBasicDescription inputFormat = [inputNode nodeFormat];
uint32_t inputChannelConfig = 0;
if([properties valueForKey:@"channelConfig"])
inputChannelConfig = [[properties valueForKey:@"channelConfig"] unsignedIntValue];
DLog(@"Input Properties: %@", properties);
if(![converterNode setupWithInputFormat:inputFormat withInputConfig:inputChannelConfig isLossless:[[properties objectForKey:@"encoding"] isEqualToString:@"lossless"]])
return NO;
[self initDownmixer];
[self setRGInfo:rgi];
@ -105,30 +101,16 @@
}
- (BOOL)openWithDecoder:(id<CogDecoder>)decoder
withOutputFormat:(AudioStreamBasicDescription)outputFormat
withUserInfo:(id)userInfo
withRGInfo:(NSDictionary *)rgi;
{
DLog(@"New buffer chain!");
[self setUserInfo:userInfo];
if(![self buildChain]) {
DLog(@"Couldn't build processing chain...");
return NO;
}
[self buildChain];
if(![inputNode openWithDecoder:decoder])
return NO;
if(![self initConverter:outputFormat])
return NO;
[self initDownmixer];
[self setRGInfo:rgi];
return YES;
}
- (BOOL)initConverter:(AudioStreamBasicDescription)outputFormat {
NSDictionary *properties = [inputNode properties];
DLog(@"Input Properties: %@", properties);
@ -138,21 +120,12 @@
if([properties valueForKey:@"channelConfig"])
inputChannelConfig = [[properties valueForKey:@"channelConfig"] unsignedIntValue];
outputFormat.mChannelsPerFrame = inputFormat.mChannelsPerFrame;
outputFormat.mBytesPerFrame = ((outputFormat.mBitsPerChannel + 7) / 8) * outputFormat.mChannelsPerFrame;
outputFormat.mBytesPerPacket = outputFormat.mBytesPerFrame * outputFormat.mFramesPerPacket;
if(![converterNode setupWithInputFormat:inputFormat withInputConfig:inputChannelConfig outputFormat:outputFormat isLossless:[[properties valueForKey:@"encoding"] isEqualToString:@"lossless"]])
if(![converterNode setupWithInputFormat:inputFormat withInputConfig:inputChannelConfig isLossless:[[properties objectForKey:@"encoding"] isEqualToString:@"lossless"]])
return NO;
return YES;
}
[self setRGInfo:rgi];
- (void)initDownmixer {
AudioPlayer * audioPlayer = controller;
OutputNode *outputNode = [audioPlayer output];
DSPDownmixNode *downmixNode = [outputNode downmix];
[downmixNode setOutputFormat:[outputNode deviceFormat] withChannelConfig:[outputNode deviceChannelConfig]];
return YES;
}
- (void)launchThreads {
@ -182,8 +155,7 @@
- (void)dealloc {
[inputNode setShouldContinue:NO];
[[inputNode exitAtTheEndOfTheStream] signal];
[[inputNode writeSemaphore] signal];
if(![inputNode threadExited])
[[inputNode semaphore] signal];
[[inputNode exitAtTheEndOfTheStream] wait]; // wait for decoder to be closed (see InputNode's -(void)process )
DLog(@"Bufferchain dealloc");

View file

@ -8,57 +8,24 @@
#import <CoreAudio/CoreAudio.h>
#import <Foundation/Foundation.h>
#import <CogAudio/AudioChunk.h>
#import "AudioChunk.h"
#import <CogAudio/CogSemaphore.h>
#import "Semaphore.h"
NS_ASSUME_NONNULL_BEGIN
#define DSD_DECIMATE 1
@interface ChunkList : NSObject {
NSMutableArray<AudioChunk *> *chunkList;
double listDuration;
double listDurationRatioed;
double maxDuration;
BOOL inAdder;
BOOL inRemover;
BOOL inPeeker;
BOOL inMerger;
BOOL inConverter;
BOOL stopping;
// For format converter
void *inputBuffer;
size_t inputBufferSize;
#if DSD_DECIMATE
void **dsd2pcm;
size_t dsd2pcmCount;
int dsd2pcmLatency;
#endif
BOOL observersRegistered;
BOOL halveDSDVolume;
BOOL enableHDCD;
void *hdcd_decoder;
BOOL formatRead;
AudioStreamBasicDescription inputFormat;
AudioStreamBasicDescription floatFormat;
uint32_t inputChannelConfig;
BOOL inputLossless;
uint8_t *tempData;
size_t tempDataSize;
}
@property(readonly) double listDuration;
@property(readonly) double listDurationRatioed;
@property(readonly) double maxDuration;
- (id)initWithMaximumDuration:(double)duration;
@ -71,16 +38,8 @@ NS_ASSUME_NONNULL_BEGIN
- (void)addChunk:(AudioChunk *)chunk;
- (AudioChunk *)removeSamples:(size_t)maxFrameCount;
- (AudioChunk *)removeSamplesAsFloat32:(size_t)maxFrameCount;
- (BOOL)peekFormat:(nonnull AudioStreamBasicDescription *)format channelConfig:(nonnull uint32_t *)config;
- (BOOL)peekTimestamp:(nonnull double *)timestamp timeRatio:(nonnull double *)timeRatio;
// Helpers
- (AudioChunk *)removeAndMergeSamples:(size_t)maxFrameCount callBlock:(BOOL(NS_NOESCAPE ^ _Nonnull)(void))block;
- (AudioChunk *)removeAndMergeSamplesAsFloat32:(size_t)maxFrameCount callBlock:(BOOL(NS_NOESCAPE ^ _Nonnull)(void))block;
@end
NS_ASSUME_NONNULL_END

View file

@ -5,371 +5,11 @@
// Created by Christopher Snowhill on 2/5/22.
//
#import <Accelerate/Accelerate.h>
#import "ChunkList.h"
#import "hdcd_decode2.h"
#if !DSD_DECIMATE
#import "dsd2float.h"
#endif
#ifdef _DEBUG
#import "BadSampleCleaner.h"
#endif
static void *kChunkListContext = &kChunkListContext;
#if DSD_DECIMATE
/**
* DSD 2 PCM: Stage 1:
* Decimate by factor 8
* (one byte (8 samples) -> one float sample)
* The bits are processed from least significant to most significant.
* @author Sebastian Gesemann
*/
/**
* This is the 2nd half of an even order symmetric FIR
* lowpass filter (to be used on a signal sampled at 44100*64 Hz)
* Passband is 0-24 kHz (ripples +/- 0.025 dB)
* Stopband starts at 176.4 kHz (rejection: 170 dB)
* The overall gain is 2.0
*/
#define dsd2pcm_FILTER_COEFFS_COUNT 64
static const float dsd2pcm_FILTER_COEFFS[64] = {
0.09712411121659f, 0.09613438994044f, 0.09417884216316f, 0.09130441727307f,
0.08757947648990f, 0.08309142055179f, 0.07794369263673f, 0.07225228745463f,
0.06614191680338f, 0.05974199351302f, 0.05318259916599f, 0.04659059631228f,
0.04008603356890f, 0.03377897290478f, 0.02776684382775f, 0.02213240062966f,
0.01694232798846f, 0.01224650881275f, 0.00807793792573f, 0.00445323755944f,
0.00137370697215f, -0.00117318019994f, -0.00321193033831f, -0.00477694265140f,
-0.00591028841335f, -0.00665946056286f, -0.00707518873201f, -0.00720940203988f,
-0.00711340642819f, -0.00683632603227f, -0.00642384017266f, -0.00591723006715f,
-0.00535273320457f, -0.00476118922548f, -0.00416794965654f, -0.00359301524813f,
-0.00305135909510f, -0.00255339111833f, -0.00210551956895f, -0.00171076760278f,
-0.00136940723130f, -0.00107957856005f, -0.00083786862365f, -0.00063983084245f,
-0.00048043272086f, -0.00035442550015f, -0.00025663481039f, -0.00018217573430f,
-0.00012659899635f, -0.00008597726991f, -0.00005694188820f, -0.00003668060332f,
-0.00002290670286f, -0.00001380895679f, -0.00000799057558f, -0.00000440385083f,
-0.00000228567089f, -0.00000109760778f, -0.00000047286430f, -0.00000017129652f,
-0.00000004282776f, 0.00000000119422f, 0.00000000949179f, 0.00000000747450f
};
struct dsd2pcm_state {
/*
* This is the 2nd half of an even order symmetric FIR
* lowpass filter (to be used on a signal sampled at 44100*64 Hz)
* Passband is 0-24 kHz (ripples +/- 0.025 dB)
* Stopband starts at 176.4 kHz (rejection: 170 dB)
* The overall gain is 2.0
*/
/* These remain constant for the duration */
int FILT_LOOKUP_PARTS;
float *FILT_LOOKUP_TABLE;
uint8_t *REVERSE_BITS;
int FIFO_LENGTH;
int FIFO_OFS_MASK;
/* These are altered */
int *fifo;
int fpos;
};
static void dsd2pcm_free(void *);
static void dsd2pcm_reset(void *);
static void *dsd2pcm_alloc(void) {
struct dsd2pcm_state *state = (struct dsd2pcm_state *)calloc(1, sizeof(struct dsd2pcm_state));
float *FILT_LOOKUP_TABLE;
double *temp;
uint8_t *REVERSE_BITS;
if(!state)
return NULL;
state->FILT_LOOKUP_PARTS = (dsd2pcm_FILTER_COEFFS_COUNT + 7) / 8;
const int FILT_LOOKUP_PARTS = state->FILT_LOOKUP_PARTS;
// The current 128 tap FIR leads to an 8 KB lookup table
state->FILT_LOOKUP_TABLE = (float *)calloc(sizeof(float), FILT_LOOKUP_PARTS << 8);
if(!state->FILT_LOOKUP_TABLE)
goto fail;
FILT_LOOKUP_TABLE = state->FILT_LOOKUP_TABLE;
temp = (double *)calloc(sizeof(double), 0x100);
if(!temp)
goto fail;
for(int part = 0, sofs = 0, dofs = 0; part < FILT_LOOKUP_PARTS;) {
memset(temp, 0, 0x100 * sizeof(double));
for(int bit = 0, bitmask = 0x80; bit < 8 && sofs + bit < dsd2pcm_FILTER_COEFFS_COUNT;) {
double coeff = dsd2pcm_FILTER_COEFFS[sofs + bit];
for(int bite = 0; bite < 0x100; bite++) {
if((bite & bitmask) == 0) {
temp[bite] -= coeff;
} else {
temp[bite] += coeff;
}
}
bit++;
bitmask >>= 1;
}
for(int s = 0; s < 0x100;) {
FILT_LOOKUP_TABLE[dofs++] = (float)temp[s++];
}
part++;
sofs += 8;
}
free(temp);
{ // calculate FIFO stuff
int k = 1;
while(k < FILT_LOOKUP_PARTS * 2) k <<= 1;
state->FIFO_LENGTH = k;
state->FIFO_OFS_MASK = k - 1;
}
state->REVERSE_BITS = (uint8_t *)calloc(1, 0x100);
if(!state->REVERSE_BITS)
goto fail;
REVERSE_BITS = state->REVERSE_BITS;
for(int i = 0, j = 0; i < 0x100; i++) {
REVERSE_BITS[i] = (uint8_t)j;
// "reverse-increment" of j
for(int bitmask = 0x80;;) {
if(((j ^= bitmask) & bitmask) != 0) break;
if(bitmask == 1) break;
bitmask >>= 1;
}
}
state->fifo = (int *)calloc(sizeof(int), state->FIFO_LENGTH);
if(!state->fifo)
goto fail;
dsd2pcm_reset(state);
return (void *)state;
fail:
dsd2pcm_free(state);
return NULL;
}
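The FIFO sizing in dsd2pcm_alloc rounds FILT_LOOKUP_PARTS * 2 up to a power of two so indices can wrap with a bitwise AND instead of a modulo. A minimal sketch of that idiom, with illustrative names not taken from the source:

#include <assert.h>
#include <stddef.h>

/* Hypothetical helper: round a capacity up to the next power of two so
   ring-buffer indices can wrap with a mask, mirroring FIFO_LENGTH and
   FIFO_OFS_MASK above. */
static size_t round_up_pow2(size_t n) {
	size_t k = 1;
	while(k < n) k <<= 1;
	return k;
}

static void fifo_mask_example(void) {
	size_t length = round_up_pow2(8 * 2); /* FILT_LOOKUP_PARTS * 2 -> 16 */
	size_t mask = length - 1;             /* 0x0F, the FIFO_OFS_MASK analogue */
	assert((21 & mask) == 21 % length);   /* masking == modulo for powers of two */
}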
static void *dsd2pcm_dup(void *_state) {
struct dsd2pcm_state *state = (struct dsd2pcm_state *)_state;
if(state) {
struct dsd2pcm_state *newstate = (struct dsd2pcm_state *)calloc(1, sizeof(struct dsd2pcm_state));
if(newstate) {
newstate->FILT_LOOKUP_PARTS = state->FILT_LOOKUP_PARTS;
newstate->FIFO_LENGTH = state->FIFO_LENGTH;
newstate->FIFO_OFS_MASK = state->FIFO_OFS_MASK;
newstate->fpos = state->fpos;
newstate->FILT_LOOKUP_TABLE = (float *)calloc(sizeof(float), state->FILT_LOOKUP_PARTS << 8);
if(!newstate->FILT_LOOKUP_TABLE)
goto fail;
memcpy(newstate->FILT_LOOKUP_TABLE, state->FILT_LOOKUP_TABLE, sizeof(float) * (state->FILT_LOOKUP_PARTS << 8));
newstate->REVERSE_BITS = (uint8_t *)calloc(1, 0x100);
if(!newstate->REVERSE_BITS)
goto fail;
memcpy(newstate->REVERSE_BITS, state->REVERSE_BITS, 0x100);
newstate->fifo = (int *)calloc(sizeof(int), state->FIFO_LENGTH);
if(!newstate->fifo)
goto fail;
memcpy(newstate->fifo, state->fifo, sizeof(int) * state->FIFO_LENGTH);
return (void *)newstate;
}
fail:
dsd2pcm_free(newstate);
return NULL;
}
return NULL;
}
static void dsd2pcm_free(void *_state) {
struct dsd2pcm_state *state = (struct dsd2pcm_state *)_state;
if(state) {
free(state->fifo);
free(state->REVERSE_BITS);
free(state->FILT_LOOKUP_TABLE);
free(state);
}
}
static void dsd2pcm_reset(void *_state) {
struct dsd2pcm_state *state = (struct dsd2pcm_state *)_state;
const int FILT_LOOKUP_PARTS = state->FILT_LOOKUP_PARTS;
int *fifo = state->fifo;
for(int i = 0; i < FILT_LOOKUP_PARTS; i++) {
fifo[i] = 0x55;
fifo[i + FILT_LOOKUP_PARTS] = 0xAA;
}
state->fpos = FILT_LOOKUP_PARTS;
}
static int dsd2pcm_latency(void *_state) {
struct dsd2pcm_state *state = (struct dsd2pcm_state *)_state;
if(state)
return state->FILT_LOOKUP_PARTS * 8;
else
return 0;
}
static void dsd2pcm_process(void *_state, const uint8_t *src, size_t sofs, size_t sinc, float *dest, size_t dofs, size_t dinc, size_t len) {
struct dsd2pcm_state *state = (struct dsd2pcm_state *)_state;
int bite1, bite2, temp;
float sample;
int *fifo = state->fifo;
const uint8_t *REVERSE_BITS = state->REVERSE_BITS;
const float *FILT_LOOKUP_TABLE = state->FILT_LOOKUP_TABLE;
const int FILT_LOOKUP_PARTS = state->FILT_LOOKUP_PARTS;
const int FIFO_OFS_MASK = state->FIFO_OFS_MASK;
int fpos = state->fpos;
while(len > 0) {
fifo[fpos] = REVERSE_BITS[fifo[fpos]] & 0xFF;
fifo[(fpos + FILT_LOOKUP_PARTS) & FIFO_OFS_MASK] = src[sofs] & 0xFF;
sofs += sinc;
temp = (fpos + 1) & FIFO_OFS_MASK;
sample = 0;
for(int k = 0, lofs = 0; k < FILT_LOOKUP_PARTS;) {
bite1 = fifo[(fpos - k) & FIFO_OFS_MASK];
bite2 = fifo[(temp + k) & FIFO_OFS_MASK];
sample += FILT_LOOKUP_TABLE[lofs + bite1] + FILT_LOOKUP_TABLE[lofs + bite2];
k++;
lofs += 0x100;
}
fpos = temp;
dest[dofs] = sample;
dofs += dinc;
len--;
}
state->fpos = fpos;
}
static void convert_dsd_to_f32(float *output, const uint8_t *input, size_t count, size_t channels, void **dsd2pcm) {
for(size_t channel = 0; channel < channels; ++channel) {
dsd2pcm_process(dsd2pcm[channel], input, channel, channels, output, channel, channels, count);
}
}
#else
static void convert_dsd_to_f32(float *output, const uint8_t *input, size_t count, size_t channels) {
const uint8_t *iptr = input;
float *optr = output;
for(size_t index = 0; index < count; ++index) {
for(size_t channel = 0; channel < channels; ++channel) {
uint8_t sample = *iptr++;
cblas_scopy(8, &dsd2float[sample][0], 1, optr++, (int)channels);
}
optr += channels * 7;
}
}
#endif
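For reference, the lookup table built in dsd2pcm_alloc precomputes the contribution of each input byte against an 8-coefficient slice of the FIR. A direct, much slower formulation of the same stage-1 decimation would look roughly like the sketch below; it is a conceptual simplification (no history FIFO, forward-reading, LSB-first bit order as in the comment above), not a drop-in replacement for the optimized routine:

#include <stddef.h>
#include <stdint.h>

/* Conceptual sketch only: map each DSD bit to +/-1, convolve with the FIR,
   and keep one float per input byte (decimation by 8). The production code
   folds 8 taps at a time into a 256-entry lookup table per part instead. */
static void dsd_decimate_naive(float *out, const uint8_t *in, size_t out_count,
                               const float *fir, size_t fir_len) {
	for(size_t n = 0; n < out_count; ++n) {
		float acc = 0.0f;
		for(size_t t = 0; t < fir_len; ++t) {
			size_t bit_index = n * 8 + t; /* 8 DSD bits consumed per PCM sample */
			uint8_t byte = in[bit_index / 8];
			int bit = (byte >> (bit_index % 8)) & 1;
			acc += fir[t] * (bit ? 1.0f : -1.0f);
		}
		out[n] = acc;
	}
}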
static void convert_u8_to_s16(int16_t *output, const uint8_t *input, size_t count) {
for(size_t i = 0; i < count; ++i) {
uint16_t sample = (input[i] << 8) | input[i];
sample ^= 0x8080;
output[i] = (int16_t)(sample);
}
}
static void convert_s8_to_s16(int16_t *output, const uint8_t *input, size_t count) {
for(size_t i = 0; i < count; ++i) {
uint16_t sample = (input[i] << 8) | input[i];
output[i] = (int16_t)(sample);
}
}
static void convert_u16_to_s16(int16_t *buffer, size_t count) {
for(size_t i = 0; i < count; ++i) {
buffer[i] ^= 0x8000;
}
}
static void convert_s16_to_hdcd_input(int32_t *output, const int16_t *input, size_t count) {
for(size_t i = 0; i < count; ++i) {
output[i] = input[i];
}
}
static void convert_s24_to_s32(int32_t *output, const uint8_t *input, size_t count) {
for(size_t i = 0; i < count; ++i) {
int32_t sample = (input[i * 3] << 8) | (input[i * 3 + 1] << 16) | (input[i * 3 + 2] << 24);
output[i] = sample;
}
}
static void convert_u24_to_s32(int32_t *output, const uint8_t *input, size_t count) {
for(size_t i = 0; i < count; ++i) {
int32_t sample = (input[i * 3] << 8) | (input[i * 3 + 1] << 16) | (input[i * 3 + 2] << 24);
output[i] = sample ^ 0x80000000;
}
}
static void convert_u32_to_s32(int32_t *buffer, size_t count) {
for(size_t i = 0; i < count; ++i) {
buffer[i] ^= 0x80000000;
}
}
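The integer conversions above rely on two small tricks: unsigned PCM differs from signed PCM only by an offset of half the range, which amounts to flipping the most significant bit (hence the XOR with 0x8080, 0x8000, or 0x80000000), and 8-bit samples are widened by replicating the byte into both halves of a 16-bit word so 0xFF maps to full scale rather than 0xFF00. A quick sanity check of both, with example values only:

#include <assert.h>
#include <stdint.h>

static void conversion_tricks_example(void) {
	/* Unsigned -> signed is an offset by half the range, i.e. an MSB flip. */
	uint16_t u = 0x8000;                      /* unsigned midpoint            */
	int16_t s = (int16_t)(u ^ 0x8000);        /* -> 0, signed silence         */
	assert(s == 0);

	/* 8-bit widening replicates the byte so 0xFF becomes 0xFFFF, not 0xFF00. */
	uint8_t b = 0xFF;
	uint16_t wide = (uint16_t)((b << 8) | b);
	assert(wide == 0xFFFF);
}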
static void convert_f64_to_f32(float *output, const double *input, size_t count) {
vDSP_vdpsp(input, 1, output, 1, count);
}
static void convert_be_to_le(uint8_t *buffer, size_t bitsPerSample, size_t bytes) {
size_t i;
bitsPerSample = (bitsPerSample + 7) / 8;
switch(bitsPerSample) {
case 2:
for(i = 0; i < bytes; i += 2) {
*(int16_t *)buffer = __builtin_bswap16(*(int16_t *)buffer);
buffer += 2;
}
break;
case 3: {
union {
vDSP_int24 int24;
uint32_t int32;
} intval;
intval.int32 = 0;
for(i = 0; i < bytes; i += 3) {
intval.int24 = *(vDSP_int24 *)buffer;
intval.int32 = __builtin_bswap32(intval.int32 << 8);
*(vDSP_int24 *)buffer = intval.int24;
buffer += 3;
}
} break;
case 4:
for(i = 0; i < bytes; i += 4) {
*(uint32_t *)buffer = __builtin_bswap32(*(uint32_t *)buffer);
buffer += 4;
}
break;
case 8:
for(i = 0; i < bytes; i += 8) {
*(uint64_t *)buffer = __builtin_bswap64(*(uint64_t *)buffer);
buffer += 8;
}
break;
}
}
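The 3-byte case of convert_be_to_le piggybacks on __builtin_bswap32: the 24-bit sample is loaded into the low bytes of a zeroed 32-bit union member, shifted up by 8 so the swap lands the bytes back in the low 24 bits, and stored back. A plain byte-reversal formulation of the same operation, shown only for comparison:

#include <stddef.h>
#include <stdint.h>

/* Straightforward in-place byte reversal of packed 24-bit samples; produces
   the same result as the union + __builtin_bswap32 trick above. */
static void swap24_naive(uint8_t *buffer, size_t bytes) {
	for(size_t i = 0; i + 2 < bytes; i += 3) {
		uint8_t tmp = buffer[i];
		buffer[i] = buffer[i + 2];
		buffer[i + 2] = tmp;
	}
}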
@implementation ChunkList
@synthesize listDuration;
@synthesize listDurationRatioed;
@synthesize maxDuration;
- (id)initWithMaximumDuration:(double)duration {
@@ -378,97 +18,28 @@ static void convert_be_to_le(uint8_t *buffer, size_t bitsPerSample, size_t bytes
if(self) {
chunkList = [[NSMutableArray alloc] init];
listDuration = 0.0;
listDurationRatioed = 0.0;
maxDuration = duration;
inAdder = NO;
inRemover = NO;
inPeeker = NO;
inMerger = NO;
inConverter = NO;
stopping = NO;
formatRead = NO;
inputBuffer = NULL;
inputBufferSize = 0;
#if DSD_DECIMATE
dsd2pcm = NULL;
dsd2pcmCount = 0;
dsd2pcmLatency = 0;
#endif
observersRegistered = NO;
}
return self;
}
- (void)addObservers {
if(!observersRegistered) {
halveDSDVolume = NO;
enableHDCD = NO;
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.halveDSDVolume" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kChunkListContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.enableHDCD" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kChunkListContext];
observersRegistered = YES;
}
}
- (void)removeObservers {
if(observersRegistered) {
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.halveDSDVolume" context:kChunkListContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.enableHDCD" context:kChunkListContext];
observersRegistered = NO;
}
}
- (void)dealloc {
stopping = YES;
while(inAdder || inRemover || inPeeker || inMerger || inConverter) {
while(inAdder || inRemover || inPeeker) {
usleep(500);
}
[self removeObservers];
if(hdcd_decoder) {
free(hdcd_decoder);
hdcd_decoder = NULL;
}
#if DSD_DECIMATE
if(dsd2pcm && dsd2pcmCount) {
for(size_t i = 0; i < dsd2pcmCount; ++i) {
dsd2pcm_free(dsd2pcm[i]);
dsd2pcm[i] = NULL;
}
free(dsd2pcm);
dsd2pcm = NULL;
}
#endif
if(tempData) {
free(tempData);
}
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
if(context != kChunkListContext) {
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
return;
}
if([keyPath isEqualToString:@"values.halveDSDVolume"]) {
halveDSDVolume = [[[NSUserDefaultsController sharedUserDefaultsController] defaults] boolForKey:@"halveDSDVolume"];
} else if([keyPath isEqualToString:@"values.enableHDCD"]) {
enableHDCD = [[[NSUserDefaultsController sharedUserDefaultsController] defaults] boolForKey:@"enableHDCD"];
}
}
- (void)reset {
@synchronized(chunkList) {
[chunkList removeAllObjects];
listDuration = 0.0;
listDurationRatioed = 0.0;
}
}
@@ -479,9 +50,7 @@ static void convert_be_to_le(uint8_t *buffer, size_t bitsPerSample, size_t bytes
}
- (BOOL)isFull {
@synchronized (chunkList) {
return (maxDuration - listDuration) < 0.001;
}
return (maxDuration - listDuration) < 0.05;
}
- (void)addChunk:(AudioChunk *)chunk {
@@ -490,12 +59,10 @@ static void convert_be_to_le(uint8_t *buffer, size_t bitsPerSample, size_t bytes
inAdder = YES;
const double chunkDuration = [chunk duration];
const double chunkDurationRatioed = [chunk durationRatioed];
@synchronized(chunkList) {
[chunkList addObject:chunk];
listDuration += chunkDuration;
listDurationRatioed += chunkDurationRatioed;
}
inAdder = NO;
@@ -516,461 +83,30 @@ static void convert_be_to_le(uint8_t *buffer, size_t bitsPerSample, size_t bytes
if([chunk frameCount] <= maxFrameCount) {
[chunkList removeObjectAtIndex:0];
listDuration -= [chunk duration];
listDurationRatioed -= [chunk durationRatioed];
inRemover = NO;
return chunk;
}
double streamTimestamp = [chunk streamTimestamp];
NSData *removedData = [chunk removeSamples:maxFrameCount];
AudioChunk *ret = [[AudioChunk alloc] init];
[ret setFormat:[chunk format]];
[ret setChannelConfig:[chunk channelConfig]];
[ret setLossless:[chunk lossless]];
[ret setStreamTimestamp:streamTimestamp];
[ret setStreamTimeRatio:[chunk streamTimeRatio]];
[ret assignData:removedData];
[ret assignSamples:[removedData bytes] frameCount:maxFrameCount];
listDuration -= [ret duration];
listDurationRatioed -= [ret durationRatioed];
inRemover = NO;
return ret;
}
}
- (AudioChunk *)removeSamplesAsFloat32:(size_t)maxFrameCount {
if(stopping) {
return [[AudioChunk alloc] init];
}
@synchronized (chunkList) {
inRemover = YES;
if(![chunkList count]) {
inRemover = NO;
return [[AudioChunk alloc] init];
}
AudioChunk *chunk = [chunkList objectAtIndex:0];
#if !DSD_DECIMATE
AudioStreamBasicDescription asbd = [chunk format];
if(asbd.mBitsPerChannel == 1) {
maxFrameCount /= 8;
}
#endif
if([chunk frameCount] <= maxFrameCount) {
[chunkList removeObjectAtIndex:0];
listDuration -= [chunk duration];
listDurationRatioed -= [chunk durationRatioed];
inRemover = NO;
return [self convertChunk:chunk];
}
double streamTimestamp = [chunk streamTimestamp];
NSData *removedData = [chunk removeSamples:maxFrameCount];
AudioChunk *ret = [[AudioChunk alloc] init];
[ret setFormat:[chunk format]];
[ret setChannelConfig:[chunk channelConfig]];
[ret setLossless:[chunk lossless]];
[ret setStreamTimestamp:streamTimestamp];
[ret setStreamTimeRatio:[chunk streamTimeRatio]];
[ret assignData:removedData];
listDuration -= [ret duration];
listDurationRatioed -= [ret durationRatioed];
inRemover = NO;
return [self convertChunk:ret];
}
}
- (AudioChunk *)removeAndMergeSamples:(size_t)maxFrameCount callBlock:(BOOL(NS_NOESCAPE ^ _Nonnull)(void))block {
if(stopping) {
return [[AudioChunk alloc] init];
}
inMerger = YES;
BOOL formatSet = NO;
AudioStreamBasicDescription currentFormat;
uint32_t currentChannelConfig = 0;
double streamTimestamp = 0.0;
double streamTimeRatio = 1.0;
BOOL blocked = NO;
while(![self peekTimestamp:&streamTimestamp timeRatio:&streamTimeRatio]) {
if((blocked = block())) {
break;
}
}
if(blocked) {
inMerger = NO;
return [[AudioChunk alloc] init];
}
AudioChunk *chunk;
size_t totalFrameCount = 0;
AudioChunk *outputChunk = [[AudioChunk alloc] init];
[outputChunk setStreamTimestamp:streamTimestamp];
[outputChunk setStreamTimeRatio:streamTimeRatio];
while(!stopping && totalFrameCount < maxFrameCount) {
AudioStreamBasicDescription newFormat;
uint32_t newChannelConfig;
if(![self peekFormat:&newFormat channelConfig:&newChannelConfig]) {
if(block()) {
break;
}
continue;
}
if(formatSet &&
(memcmp(&newFormat, &currentFormat, sizeof(newFormat)) != 0 ||
newChannelConfig != currentChannelConfig)) {
break;
} else if(!formatSet) {
[outputChunk setFormat:newFormat];
[outputChunk setChannelConfig:newChannelConfig];
currentFormat = newFormat;
currentChannelConfig = newChannelConfig;
formatSet = YES;
}
chunk = [self removeSamples:maxFrameCount - totalFrameCount];
if(!chunk || ![chunk frameCount]) {
if(block()) {
break;
}
continue;
}
if([chunk isHDCD]) {
[outputChunk setHDCD];
}
size_t frameCount = [chunk frameCount];
NSData *sampleData = [chunk removeSamples:frameCount];
[outputChunk assignData:sampleData];
totalFrameCount += frameCount;
}
if(!totalFrameCount) {
inMerger = NO;
return [[AudioChunk alloc] init];
}
inMerger = NO;
return outputChunk;
}
- (AudioChunk *)removeAndMergeSamplesAsFloat32:(size_t)maxFrameCount callBlock:(BOOL(NS_NOESCAPE ^ _Nonnull)(void))block {
AudioChunk *ret = [self removeAndMergeSamples:maxFrameCount callBlock:block];
return [self convertChunk:ret];
}
- (AudioChunk *)convertChunk:(AudioChunk *)inChunk {
if(stopping) return [[AudioChunk alloc] init];
inConverter = YES;
AudioStreamBasicDescription chunkFormat = [inChunk format];
if(![inChunk frameCount] ||
(chunkFormat.mFormatFlags == kAudioFormatFlagsNativeFloatPacked &&
chunkFormat.mBitsPerChannel == 32)) {
inConverter = NO;
return inChunk;
}
uint32_t chunkConfig = [inChunk channelConfig];
BOOL chunkLossless = [inChunk lossless];
if(!formatRead || memcmp(&chunkFormat, &inputFormat, sizeof(chunkFormat)) != 0 ||
chunkConfig != inputChannelConfig || chunkLossless != inputLossless) {
formatRead = YES;
inputFormat = chunkFormat;
inputChannelConfig = chunkConfig;
inputLossless = chunkLossless;
BOOL isFloat = !!(inputFormat.mFormatFlags & kAudioFormatFlagIsFloat);
if((!isFloat && !(inputFormat.mBitsPerChannel >= 1 && inputFormat.mBitsPerChannel <= 32)) || (isFloat && !(inputFormat.mBitsPerChannel == 32 || inputFormat.mBitsPerChannel == 64))) {
inConverter = NO;
return [[AudioChunk alloc] init];
}
// These are really placeholders, as we're doing everything internally now
if(inputLossless &&
inputFormat.mBitsPerChannel == 16 &&
inputFormat.mChannelsPerFrame == 2 &&
inputFormat.mSampleRate == 44100) {
// possibly HDCD, run through decoder
[self addObservers];
if(hdcd_decoder) {
free(hdcd_decoder);
hdcd_decoder = NULL;
}
hdcd_decoder = calloc(1, sizeof(hdcd_state_stereo_t));
hdcd_reset_stereo((hdcd_state_stereo_t *)hdcd_decoder, 44100);
}
floatFormat = inputFormat;
floatFormat.mFormatFlags = kAudioFormatFlagsNativeFloatPacked;
floatFormat.mBitsPerChannel = 32;
floatFormat.mBytesPerFrame = (32 / 8) * floatFormat.mChannelsPerFrame;
floatFormat.mBytesPerPacket = floatFormat.mBytesPerFrame * floatFormat.mFramesPerPacket;
#if DSD_DECIMATE
if(inputFormat.mBitsPerChannel == 1) {
// Decimate this for speed
floatFormat.mSampleRate *= 1.0 / 8.0;
if(dsd2pcm && dsd2pcmCount) {
for(size_t i = 0; i < dsd2pcmCount; ++i) {
dsd2pcm_free(dsd2pcm[i]);
dsd2pcm[i] = NULL;
}
free(dsd2pcm);
dsd2pcm = NULL;
}
dsd2pcmCount = floatFormat.mChannelsPerFrame;
dsd2pcm = (void **)calloc(dsd2pcmCount, sizeof(void *));
dsd2pcm[0] = dsd2pcm_alloc();
dsd2pcmLatency = dsd2pcm_latency(dsd2pcm[0]);
for(size_t i = 1; i < dsd2pcmCount; ++i) {
dsd2pcm[i] = dsd2pcm_dup(dsd2pcm[0]);
}
}
#endif
}
NSUInteger samplesRead = [inChunk frameCount];
if(!samplesRead) {
inConverter = NO;
return [[AudioChunk alloc] init];
}
BOOL isFloat = !!(inputFormat.mFormatFlags & kAudioFormatFlagIsFloat);
BOOL isUnsigned = !isFloat && !(inputFormat.mFormatFlags & kAudioFormatFlagIsSignedInteger);
size_t bitsPerSample = inputFormat.mBitsPerChannel;
BOOL isBigEndian = !!(inputFormat.mFormatFlags & kAudioFormatFlagIsBigEndian);
double streamTimestamp = [inChunk streamTimestamp];
NSData *inputData = [inChunk removeSamples:samplesRead];
#if DSD_DECIMATE
const size_t sizeFactor = 3;
#else
const size_t sizeFactor = (bitsPerSample == 1) ? 9 : 3;
#endif
size_t newSize = samplesRead * floatFormat.mBytesPerPacket * sizeFactor + 64;
if(!tempData || tempDataSize < newSize)
tempData = realloc(tempData, tempDataSize = newSize); // Either two buffers plus padding, and/or double precision in case of endian flip
// double buffer system, with alignment
const size_t buffer_adder_base = (samplesRead * floatFormat.mBytesPerPacket + 31) & ~31;
NSUInteger bytesReadFromInput = samplesRead * inputFormat.mBytesPerPacket;
uint8_t *inputBuffer = (uint8_t *)[inputData bytes];
BOOL inputChanged = NO;
BOOL hdcdSustained = NO;
if(bytesReadFromInput && isBigEndian) {
// Time for endian swap!
memcpy(&tempData[0], [inputData bytes], bytesReadFromInput);
convert_be_to_le((uint8_t *)(&tempData[0]), inputFormat.mBitsPerChannel, bytesReadFromInput);
inputBuffer = &tempData[0];
inputChanged = YES;
}
if(bytesReadFromInput && isFloat && bitsPerSample == 64) {
// Time for precision loss from weird inputs
const size_t buffer_adder = (inputBuffer == &tempData[0]) ? buffer_adder_base * 2 : 0;
samplesRead = bytesReadFromInput / sizeof(double);
convert_f64_to_f32((float *)(&tempData[buffer_adder]), (const double *)inputBuffer, samplesRead);
bytesReadFromInput = samplesRead * sizeof(float);
inputBuffer = &tempData[buffer_adder];
inputChanged = YES;
bitsPerSample = 32;
}
if(bytesReadFromInput && !isFloat) {
float gain = 1.0;
if(bitsPerSample == 1) {
const size_t buffer_adder = (inputBuffer == &tempData[0]) ? buffer_adder_base : 0;
samplesRead = bytesReadFromInput / inputFormat.mBytesPerPacket;
convert_dsd_to_f32((float *)(&tempData[buffer_adder]), (const uint8_t *)inputBuffer, samplesRead, inputFormat.mChannelsPerFrame
#if DSD_DECIMATE
,
dsd2pcm
#endif
);
#if !DSD_DECIMATE
samplesRead *= 8;
#endif
bitsPerSample = 32;
bytesReadFromInput = samplesRead * floatFormat.mBytesPerPacket;
isFloat = YES;
inputBuffer = &tempData[buffer_adder];
inputChanged = YES;
[self addObservers];
#if DSD_DECIMATE
if(halveDSDVolume) {
float scaleFactor = 2.0f;
vDSP_vsdiv((float *)inputBuffer, 1, &scaleFactor, (float *)inputBuffer, 1, bytesReadFromInput / sizeof(float));
}
#else
if(!halveDSDVolume) {
float scaleFactor = 2.0f;
vDSP_vsmul((float *)inputBuffer, 1, &scaleFactor, (float *)inputBuffer, 1, bytesReadFromInput / sizeof(float));
}
#endif
} else if(bitsPerSample <= 8) {
samplesRead = bytesReadFromInput;
const size_t buffer_adder = (inputBuffer == &tempData[0]) ? buffer_adder_base : 0;
if(!isUnsigned)
convert_s8_to_s16((int16_t *)(&tempData[buffer_adder]), (const uint8_t *)inputBuffer, samplesRead);
else
convert_u8_to_s16((int16_t *)(&tempData[buffer_adder]), (const uint8_t *)inputBuffer, samplesRead);
bitsPerSample = 16;
bytesReadFromInput = samplesRead * 2;
isUnsigned = NO;
inputBuffer = &tempData[buffer_adder];
inputChanged = YES;
}
if(hdcd_decoder) { // implied bits per sample is 16, produces 32 bit int scale
samplesRead = bytesReadFromInput / 2;
const size_t buffer_adder = (inputBuffer == &tempData[0]) ? buffer_adder_base : 0;
if(isUnsigned) {
if(!inputChanged) {
memcpy(&tempData[buffer_adder], inputBuffer, samplesRead * 2);
inputBuffer = &tempData[buffer_adder];
inputChanged = YES;
}
convert_u16_to_s16((int16_t *)inputBuffer, samplesRead);
isUnsigned = NO;
}
const size_t buffer_adder2 = (inputBuffer == &tempData[0]) ? buffer_adder_base : 0;
convert_s16_to_hdcd_input((int32_t *)(&tempData[buffer_adder2]), (int16_t *)inputBuffer, samplesRead);
hdcd_process_stereo((hdcd_state_stereo_t *)hdcd_decoder, (int32_t *)(&tempData[buffer_adder2]), (int)(samplesRead / 2));
if(((hdcd_state_stereo_t *)hdcd_decoder)->channel[0].sustain &&
((hdcd_state_stereo_t *)hdcd_decoder)->channel[1].sustain) {
hdcdSustained = YES;
}
if(enableHDCD) {
gain = 2.0;
bitsPerSample = 32;
bytesReadFromInput = samplesRead * 4;
isUnsigned = NO;
inputBuffer = &tempData[buffer_adder2];
inputChanged = YES;
} else {
// Discard the output of the decoder and process again
goto process16bit;
}
} else if(bitsPerSample <= 16) {
process16bit:
samplesRead = bytesReadFromInput / 2;
const size_t buffer_adder = (inputBuffer == &tempData[0]) ? buffer_adder_base : 0;
if(isUnsigned) {
if(!inputChanged) {
memcpy(&tempData[buffer_adder], inputBuffer, samplesRead * 2);
inputBuffer = &tempData[buffer_adder];
inputChanged = YES;
}
convert_u16_to_s16((int16_t *)inputBuffer, samplesRead);
}
const size_t buffer_adder2 = (inputBuffer == &tempData[0]) ? buffer_adder_base : 0;
vDSP_vflt16((const short *)inputBuffer, 1, (float *)(&tempData[buffer_adder2]), 1, samplesRead);
float scale = 1ULL << 15;
vDSP_vsdiv((const float *)(&tempData[buffer_adder2]), 1, &scale, (float *)(&tempData[buffer_adder2]), 1, samplesRead);
bitsPerSample = 32;
bytesReadFromInput = samplesRead * sizeof(float);
isUnsigned = NO;
isFloat = YES;
inputBuffer = &tempData[buffer_adder2];
inputChanged = YES;
} else if(bitsPerSample <= 24) {
const size_t buffer_adder = (inputBuffer == &tempData[0]) ? buffer_adder_base : 0;
samplesRead = bytesReadFromInput / 3;
if(isUnsigned)
convert_u24_to_s32((int32_t *)(&tempData[buffer_adder]), (uint8_t *)inputBuffer, samplesRead);
else
convert_s24_to_s32((int32_t *)(&tempData[buffer_adder]), (uint8_t *)inputBuffer, samplesRead);
bitsPerSample = 32;
bytesReadFromInput = samplesRead * 4;
isUnsigned = NO;
inputBuffer = &tempData[buffer_adder];
inputChanged = YES;
}
if(!isFloat && bitsPerSample <= 32) {
samplesRead = bytesReadFromInput / 4;
if(isUnsigned) {
if(!inputChanged) {
memcpy(&tempData[0], inputBuffer, bytesReadFromInput);
inputBuffer = &tempData[0];
}
convert_u32_to_s32((int32_t *)inputBuffer, samplesRead);
}
const size_t buffer_adder = (inputBuffer == &tempData[0]) ? buffer_adder_base : 0; // vDSP functions expect aligned to four elements
vDSP_vflt32((const int *)inputBuffer, 1, (float *)(&tempData[buffer_adder]), 1, samplesRead);
float scale = (1ULL << 31) / gain;
vDSP_vsdiv((const float *)(&tempData[buffer_adder]), 1, &scale, (float *)(&tempData[buffer_adder]), 1, samplesRead);
bitsPerSample = 32;
bytesReadFromInput = samplesRead * sizeof(float);
isUnsigned = NO;
isFloat = YES;
inputBuffer = &tempData[buffer_adder];
}
#ifdef _DEBUG
[BadSampleCleaner cleanSamples:(float *)inputBuffer
amount:bytesReadFromInput / sizeof(float)
location:@"post int to float conversion"];
#endif
}
AudioChunk *outChunk = [[AudioChunk alloc] init];
[outChunk setFormat:floatFormat];
[outChunk setChannelConfig:inputChannelConfig];
[outChunk setLossless:inputLossless];
[outChunk setStreamTimestamp:streamTimestamp];
[outChunk setStreamTimeRatio:[inChunk streamTimeRatio]];
if(hdcdSustained) [outChunk setHDCD];
[outChunk assignSamples:inputBuffer frameCount:bytesReadFromInput / floatFormat.mBytesPerPacket];
inConverter = NO;
return outChunk;
}
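The buffer_adder offsets in convertChunk stagger the scratch regions inside tempData, rounding each region up to a 16- or 32-byte boundary so the vDSP calls see suitably aligned pointers. The round-up idiom on its own, with illustrative names:

#include <assert.h>
#include <stddef.h>

/* Round n up to the next multiple of a power-of-two alignment. */
static size_t align_up(size_t n, size_t alignment) {
	return (n + alignment - 1) & ~(alignment - 1);
}

static void align_up_example(void) {
	assert(align_up(100, 32) == 128); /* (100 + 31) & ~31 */
	assert(align_up(128, 32) == 128); /* already aligned, unchanged */
}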
- (BOOL)peekFormat:(AudioStreamBasicDescription *)format channelConfig:(uint32_t *)config {
if(stopping) return NO;
inPeeker = YES;
@synchronized(chunkList) {
if([chunkList count]) {
AudioChunk *chunk = [chunkList objectAtIndex:0];
*format = [chunk format];
*config = [chunk channelConfig];
inPeeker = NO;
return YES;
}
}
inPeeker = NO;
return NO;
}
- (BOOL)peekTimestamp:(double *)timestamp timeRatio:(double *)timeRatio {
if(stopping) return NO;
inPeeker = YES;
@synchronized (chunkList) {
if([chunkList count]) {
AudioChunk *chunk = [chunkList objectAtIndex:0];
*timestamp = [chunk streamTimestamp];
*timeRatio = [chunk streamTimeRatio];
inPeeker = NO;
return YES;
}
}
*timestamp = 0.0;
*timeRatio = 1.0;
inPeeker = NO;
return NO;
}

View file

@@ -12,78 +12,62 @@
#import <AudioUnit/AudioUnit.h>
#import <CoreAudio/AudioHardware.h>
#import <CogAudio/soxr.h>
#import "Node.h"
#import <CogAudio/Node.h>
#define DSD_DECIMATE 1
@interface ConverterNode : Node {
NSDictionary *rgInfo;
soxr_t soxr;
void *inputBuffer;
size_t inputBufferSize;
size_t inpSize, inpOffset;
double streamTimestamp, streamTimeRatio;
BOOL stopping;
BOOL convertEntered;
BOOL paused;
BOOL skipResampler;
unsigned int PRIME_LEN_;
unsigned int N_samples_to_add_;
unsigned int N_samples_to_drop_;
BOOL is_preextrapolated_;
int is_postextrapolated_;
int latencyEaten;
int latencyEatenPost;
double sampleRatio;
BOOL observersAdded;
float volumeScale;
void *floatBuffer;
size_t floatBufferSize;
size_t floatSize, floatOffset;
void *extrapolateBuffer;
size_t extrapolateBufferSize;
#if DSD_DECIMATE
void **dsd2pcm;
size_t dsd2pcmCount;
int dsd2pcmLatency;
#endif
BOOL rememberedLossless;
AudioStreamBasicDescription inputFormat;
AudioStreamBasicDescription floatFormat;
AudioStreamBasicDescription outputFormat;
AudioStreamBasicDescription dmFloatFormat; // downmixed/upmixed float format
uint32_t inputChannelConfig;
BOOL streamFormatChanged;
AudioStreamBasicDescription newInputFormat;
uint32_t newInputChannelConfig;
AudioChunk *lastChunkIn;
void *hdcd_decoder;
}
@property AudioStreamBasicDescription inputFormat;
- (id)initWithController:(id)c previous:(id)p;
- (BOOL)setupWithInputFormat:(AudioStreamBasicDescription)inputFormat withInputConfig:(uint32_t)inputConfig outputFormat:(AudioStreamBasicDescription)outputFormat isLossless:(BOOL)lossless;
- (BOOL)setupWithInputFormat:(AudioStreamBasicDescription)inputFormat withInputConfig:(uint32_t)inputConfig isLossless:(BOOL)lossless;
- (void)cleanUp;
- (BOOL)paused;
- (void)process;
- (AudioChunk *)convert;
- (int)convert:(void *)dest amount:(int)amount;
- (void)setRGInfo:(NSDictionary *)rgi;
- (void)setOutputFormat:(AudioStreamBasicDescription)outputFormat;
- (void)inputFormatDidChange:(AudioStreamBasicDescription)format inputConfig:(uint32_t)inputConfig;
- (void)refreshVolumeScaling;

View file

@@ -16,13 +16,16 @@
#import "Logging.h"
#import "lpc.h"
#import "util.h"
#import "hdcd_decode2.h"
#ifdef _DEBUG
#import "BadSampleCleaner.h"
#endif
#if !DSD_DECIMATE
#include "dsd2float.h"
#endif
void PrintStreamDesc(AudioStreamBasicDescription *inDesc) {
if(!inDesc) {
DLog(@"Can't print a NULL desc!\n");
@@ -51,7 +54,6 @@ static void *kConverterNodeContext = &kConverterNodeContext;
if(self) {
rgInfo = nil;
soxr = 0;
inputBuffer = NULL;
inputBufferSize = 0;
floatBuffer = NULL;
@@ -61,123 +63,446 @@ static void *kConverterNodeContext = &kConverterNodeContext;
convertEntered = NO;
paused = NO;
skipResampler = YES;
extrapolateBuffer = NULL;
extrapolateBufferSize = 0;
#ifdef LOG_CHAINS
[self initLogFiles];
#if DSD_DECIMATE
dsd2pcm = NULL;
dsd2pcmCount = 0;
#endif
hdcd_decoder = NULL;
lastChunkIn = nil;
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.volumeScaling" options:0 context:kConverterNodeContext];
}
return self;
}
- (void)addObservers {
if(!observersAdded) {
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.volumeScaling" options:(NSKeyValueObservingOptionInitial|NSKeyValueObservingOptionNew) context:kConverterNodeContext];
observersAdded = YES;
}
}
- (void)removeObservers {
if(observersAdded) {
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.volumeScaling" context:kConverterNodeContext];
observersAdded = NO;
}
}
void scale_by_volume(float *buffer, size_t count, float volume) {
if(volume != 1.0) {
size_t unaligned = (uintptr_t)buffer & 15;
if(unaligned) {
size_t count_unaligned = (16 - unaligned) / sizeof(float);
while(count > 0 && count_unaligned > 0) {
size_t count3 = unaligned >> 2;
while(count3 > 0) {
*buffer++ *= volume;
count_unaligned--;
count3--;
count--;
}
}
if(count) {
vDSP_vsmul(buffer, 1, &volume, buffer, 1, count);
}
}
}
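The diff above interleaves two revisions of the unaligned-head handling in scale_by_volume, but the intent in both is the same: multiply the first few floats one at a time until the pointer reaches a 16-byte boundary, then hand the aligned remainder to vDSP_vsmul. A minimal sketch of that shape, assuming Accelerate is available; it is not the exact code from either side of the diff:

#include <Accelerate/Accelerate.h>
#include <stdint.h>

static void scale_by_volume_sketch(float *buffer, size_t count, float volume) {
	if(volume == 1.0f) return;
	/* Scalar loop over the unaligned head. */
	while(count > 0 && ((uintptr_t)buffer & 15) != 0) {
		*buffer++ *= volume;
		count--;
	}
	/* Vectorized multiply over the aligned remainder. */
	if(count) {
		vDSP_vsmul(buffer, 1, &volume, buffer, 1, count);
	}
}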
- (BOOL)paused {
return paused;
#if DSD_DECIMATE
/**
* DSD 2 PCM: Stage 1:
* Decimate by factor 8
* (one byte (8 samples) -> one float sample)
* The bits are processed from least significant to most significant.
* @author Sebastian Gesemann
*/
#define dsd2pcm_FILTER_COEFFS_COUNT 64
static const float dsd2pcm_FILTER_COEFFS[64] = {
0.09712411121659f, 0.09613438994044f, 0.09417884216316f, 0.09130441727307f,
0.08757947648990f, 0.08309142055179f, 0.07794369263673f, 0.07225228745463f,
0.06614191680338f, 0.05974199351302f, 0.05318259916599f, 0.04659059631228f,
0.04008603356890f, 0.03377897290478f, 0.02776684382775f, 0.02213240062966f,
0.01694232798846f, 0.01224650881275f, 0.00807793792573f, 0.00445323755944f,
0.00137370697215f, -0.00117318019994f, -0.00321193033831f, -0.00477694265140f,
-0.00591028841335f, -0.00665946056286f, -0.00707518873201f, -0.00720940203988f,
-0.00711340642819f, -0.00683632603227f, -0.00642384017266f, -0.00591723006715f,
-0.00535273320457f, -0.00476118922548f, -0.00416794965654f, -0.00359301524813f,
-0.00305135909510f, -0.00255339111833f, -0.00210551956895f, -0.00171076760278f,
-0.00136940723130f, -0.00107957856005f, -0.00083786862365f, -0.00063983084245f,
-0.00048043272086f, -0.00035442550015f, -0.00025663481039f, -0.00018217573430f,
-0.00012659899635f, -0.00008597726991f, -0.00005694188820f, -0.00003668060332f,
-0.00002290670286f, -0.00001380895679f, -0.00000799057558f, -0.00000440385083f,
-0.00000228567089f, -0.00000109760778f, -0.00000047286430f, -0.00000017129652f,
-0.00000004282776f, 0.00000000119422f, 0.00000000949179f, 0.00000000747450f
};
struct dsd2pcm_state {
/*
* This is the 2nd half of an even order symmetric FIR
* lowpass filter (to be used on a signal sampled at 44100*64 Hz)
* Passband is 0-24 kHz (ripples +/- 0.025 dB)
* Stopband starts at 176.4 kHz (rejection: 170 dB)
* The overall gain is 2.0
*/
/* These remain constant for the duration */
int FILT_LOOKUP_PARTS;
float *FILT_LOOKUP_TABLE;
uint8_t *REVERSE_BITS;
int FIFO_LENGTH;
int FIFO_OFS_MASK;
/* These are altered */
int *fifo;
int fpos;
};
static void dsd2pcm_free(void *);
static void dsd2pcm_reset(void *);
static void *dsd2pcm_alloc() {
struct dsd2pcm_state *state = (struct dsd2pcm_state *)calloc(1, sizeof(struct dsd2pcm_state));
float *FILT_LOOKUP_TABLE;
double *temp;
uint8_t *REVERSE_BITS;
if(!state)
return NULL;
state->FILT_LOOKUP_PARTS = (dsd2pcm_FILTER_COEFFS_COUNT + 7) / 8;
const int FILT_LOOKUP_PARTS = state->FILT_LOOKUP_PARTS;
// The current 128 tap FIR leads to an 8 KB lookup table
state->FILT_LOOKUP_TABLE = (float *)calloc(sizeof(float), FILT_LOOKUP_PARTS << 8);
if(!state->FILT_LOOKUP_TABLE)
goto fail;
FILT_LOOKUP_TABLE = state->FILT_LOOKUP_TABLE;
temp = (double *)calloc(sizeof(double), 0x100);
if(!temp)
goto fail;
for(int part = 0, sofs = 0, dofs = 0; part < FILT_LOOKUP_PARTS;) {
memset(temp, 0, 0x100 * sizeof(double));
for(int bit = 0, bitmask = 0x80; bit < 8 && sofs + bit < dsd2pcm_FILTER_COEFFS_COUNT;) {
double coeff = dsd2pcm_FILTER_COEFFS[sofs + bit];
for(int bite = 0; bite < 0x100; bite++) {
if((bite & bitmask) == 0) {
temp[bite] -= coeff;
} else {
temp[bite] += coeff;
}
}
bit++;
bitmask >>= 1;
}
for(int s = 0; s < 0x100;) {
FILT_LOOKUP_TABLE[dofs++] = (float)temp[s++];
}
part++;
sofs += 8;
}
free(temp);
{ // calculate FIFO stuff
int k = 1;
while(k < FILT_LOOKUP_PARTS * 2) k <<= 1;
state->FIFO_LENGTH = k;
state->FIFO_OFS_MASK = k - 1;
}
state->REVERSE_BITS = (uint8_t *)calloc(1, 0x100);
if(!state->REVERSE_BITS)
goto fail;
REVERSE_BITS = state->REVERSE_BITS;
for(int i = 0, j = 0; i < 0x100; i++) {
REVERSE_BITS[i] = (uint8_t)j;
// "reverse-increment" of j
for(int bitmask = 0x80;;) {
if(((j ^= bitmask) & bitmask) != 0) break;
if(bitmask == 1) break;
bitmask >>= 1;
}
}
state->fifo = (int *)calloc(sizeof(int), state->FIFO_LENGTH);
if(!state->fifo)
goto fail;
dsd2pcm_reset(state);
return (void *)state;
fail:
dsd2pcm_free(state);
return NULL;
}
static void *dsd2pcm_dup(void *_state) {
struct dsd2pcm_state *state = (struct dsd2pcm_state *)_state;
if(state) {
struct dsd2pcm_state *newstate = (struct dsd2pcm_state *)calloc(1, sizeof(struct dsd2pcm_state));
if(newstate) {
newstate->FILT_LOOKUP_PARTS = state->FILT_LOOKUP_PARTS;
newstate->FIFO_LENGTH = state->FIFO_LENGTH;
newstate->FIFO_OFS_MASK = state->FIFO_OFS_MASK;
newstate->fpos = state->fpos;
newstate->FILT_LOOKUP_TABLE = (float *)calloc(sizeof(float), state->FILT_LOOKUP_PARTS << 8);
if(!newstate->FILT_LOOKUP_TABLE)
goto fail;
memcpy(newstate->FILT_LOOKUP_TABLE, state->FILT_LOOKUP_TABLE, sizeof(float) * (state->FILT_LOOKUP_PARTS << 8));
newstate->REVERSE_BITS = (uint8_t *)calloc(1, 0x100);
if(!newstate->REVERSE_BITS)
goto fail;
memcpy(newstate->REVERSE_BITS, state->REVERSE_BITS, 0x100);
newstate->fifo = (int *)calloc(sizeof(int), state->FIFO_LENGTH);
if(!newstate->fifo)
goto fail;
memcpy(newstate->fifo, state->fifo, sizeof(int) * state->FIFO_LENGTH);
return (void *)newstate;
}
fail:
dsd2pcm_free(newstate);
return NULL;
}
return NULL;
}
static void dsd2pcm_free(void *_state) {
struct dsd2pcm_state *state = (struct dsd2pcm_state *)_state;
if(state) {
free(state->fifo);
free(state->REVERSE_BITS);
free(state->FILT_LOOKUP_TABLE);
free(state);
}
}
static void dsd2pcm_reset(void *_state) {
struct dsd2pcm_state *state = (struct dsd2pcm_state *)_state;
const int FILT_LOOKUP_PARTS = state->FILT_LOOKUP_PARTS;
int *fifo = state->fifo;
for(int i = 0; i < FILT_LOOKUP_PARTS; i++) {
fifo[i] = 0x55;
fifo[i + FILT_LOOKUP_PARTS] = 0xAA;
}
state->fpos = FILT_LOOKUP_PARTS;
}
static int dsd2pcm_latency(void *_state) {
struct dsd2pcm_state *state = (struct dsd2pcm_state *)_state;
if(state)
return state->FIFO_LENGTH;
else
return 0;
}
static void dsd2pcm_process(void *_state, const uint8_t *src, size_t sofs, size_t sinc, float *dest, size_t dofs, size_t dinc, size_t len) {
struct dsd2pcm_state *state = (struct dsd2pcm_state *)_state;
int bite1, bite2, temp;
float sample;
int *fifo = state->fifo;
const uint8_t *REVERSE_BITS = state->REVERSE_BITS;
const float *FILT_LOOKUP_TABLE = state->FILT_LOOKUP_TABLE;
const int FILT_LOOKUP_PARTS = state->FILT_LOOKUP_PARTS;
const int FIFO_OFS_MASK = state->FIFO_OFS_MASK;
int fpos = state->fpos;
while(len > 0) {
fifo[fpos] = REVERSE_BITS[fifo[fpos]] & 0xFF;
fifo[(fpos + FILT_LOOKUP_PARTS) & FIFO_OFS_MASK] = src[sofs] & 0xFF;
sofs += sinc;
temp = (fpos + 1) & FIFO_OFS_MASK;
sample = 0;
for(int k = 0, lofs = 0; k < FILT_LOOKUP_PARTS;) {
bite1 = fifo[(fpos - k) & FIFO_OFS_MASK];
bite2 = fifo[(temp + k) & FIFO_OFS_MASK];
sample += FILT_LOOKUP_TABLE[lofs + bite1] + FILT_LOOKUP_TABLE[lofs + bite2];
k++;
lofs += 0x100;
}
fpos = temp;
dest[dofs] = sample;
dofs += dinc;
len--;
}
state->fpos = fpos;
}
static void convert_dsd_to_f32(float *output, const uint8_t *input, size_t count, size_t channels, void **dsd2pcm) {
for(size_t channel = 0; channel < channels; ++channel) {
dsd2pcm_process(dsd2pcm[channel], input, channel, channels, output, channel, channels, count);
}
}
#else
static void convert_dsd_to_f32(float *output, const uint8_t *input, size_t count, size_t channels) {
const uint8_t *iptr = input;
float *optr = output;
for(size_t index = 0; index < count; ++index) {
for(size_t channel = 0; channel < channels; ++channel) {
uint8_t sample = *iptr++;
cblas_scopy(8, &dsd2float[sample][0], 1, optr++, (int)channels);
}
optr += channels * 7;
}
}
#endif
static void convert_u8_to_s16(int16_t *output, const uint8_t *input, size_t count) {
for(size_t i = 0; i < count; ++i) {
uint16_t sample = (input[i] << 8) | input[i];
sample ^= 0x8080;
output[i] = (int16_t)(sample);
}
}
static void convert_s8_to_s16(int16_t *output, const uint8_t *input, size_t count) {
for(size_t i = 0; i < count; ++i) {
uint16_t sample = (input[i] << 8) | input[i];
output[i] = (int16_t)(sample);
}
}
static void convert_u16_to_s16(int16_t *buffer, size_t count) {
for(size_t i = 0; i < count; ++i) {
buffer[i] ^= 0x8000;
}
}
static void convert_s16_to_hdcd_input(int32_t *output, const int16_t *input, size_t count) {
for(size_t i = 0; i < count; ++i) {
output[i] = input[i];
}
}
static void convert_s24_to_s32(int32_t *output, const uint8_t *input, size_t count) {
for(size_t i = 0; i < count; ++i) {
int32_t sample = (input[i * 3] << 8) | (input[i * 3 + 1] << 16) | (input[i * 3 + 2] << 24);
output[i] = sample;
}
}
static void convert_u24_to_s32(int32_t *output, const uint8_t *input, size_t count) {
for(size_t i = 0; i < count; ++i) {
int32_t sample = (input[i * 3] << 8) | (input[i * 3 + 1] << 16) | (input[i * 3 + 2] << 24);
output[i] = sample ^ 0x80000000;
}
}
static void convert_u32_to_s32(int32_t *buffer, size_t count) {
for(size_t i = 0; i < count; ++i) {
buffer[i] ^= 0x80000000;
}
}
static void convert_f64_to_f32(float *output, const double *input, size_t count) {
vDSP_vdpsp(input, 1, output, 1, count);
}
static void convert_be_to_le(uint8_t *buffer, size_t bitsPerSample, size_t bytes) {
size_t i;
bitsPerSample = (bitsPerSample + 7) / 8;
switch(bitsPerSample) {
case 2:
for(i = 0; i < bytes; i += 2) {
*(int16_t *)buffer = __builtin_bswap16(*(int16_t *)buffer);
buffer += 2;
}
break;
case 3: {
union {
vDSP_int24 int24;
uint32_t int32;
} intval;
intval.int32 = 0;
for(i = 0; i < bytes; i += 3) {
intval.int24 = *(vDSP_int24 *)buffer;
intval.int32 = __builtin_bswap32(intval.int32 << 8);
*(vDSP_int24 *)buffer = intval.int24;
buffer += 3;
}
} break;
case 4:
for(i = 0; i < bytes; i += 4) {
*(uint32_t *)buffer = __builtin_bswap32(*(uint32_t *)buffer);
buffer += 4;
}
break;
case 8:
for(i = 0; i < bytes; i += 8) {
*(uint64_t *)buffer = __builtin_bswap64(*(uint64_t *)buffer);
buffer += 8;
}
break;
}
}
- (void)process {
char writeBuf[CHUNK_SIZE];
// Removed endOfStream check from here, since we want to be able to flush the converter
// when the end of stream is reached. Convert function instead processes what it can,
// and returns 0 samples when it has nothing more to process at the end of stream.
while([self shouldContinue] == YES) {
int amountConverted;
while(paused) {
usleep(500);
}
@autoreleasepool {
AudioChunk *chunk = nil;
chunk = [self convert];
if(!chunk || ![chunk frameCount]) {
if([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES) {
endOfStream = YES;
amountConverted = [self convert:writeBuf amount:CHUNK_SIZE];
}
if(!amountConverted) {
if(paused) {
continue;
} else if(streamFormatChanged) {
[self cleanUp];
[self setupWithInputFormat:newInputFormat withInputConfig:newInputChannelConfig isLossless:rememberedLossless];
continue;
} else
break;
}
if(paused || !streamFormatChanged) {
continue;
[self writeData:writeBuf amount:amountConverted];
}
usleep(500);
} else {
[self writeChunk:chunk];
chunk = nil;
}
if(streamFormatChanged) {
[self cleanUp];
[self setupWithInputFormat:newInputFormat withInputConfig:newInputChannelConfig outputFormat:self->outputFormat isLossless:rememberedLossless];
}
}
}
endOfStream = YES;
}
- (AudioChunk *)convert {
- (int)convert:(void *)dest amount:(int)amount {
UInt32 ioNumberPackets;
int amountReadFromFC;
int amountRead = 0;
if(stopping)
return 0;
convertEntered = YES;
tryagain:
if(stopping || [self shouldContinue] == NO) {
convertEntered = NO;
return nil;
return amountRead;
}
if(inpOffset == inpSize) {
streamTimestamp = 0.0;
streamTimeRatio = 1.0;
if(![self peekTimestamp:&streamTimestamp timeRatio:&streamTimeRatio]) {
convertEntered = NO;
return nil;
}
}
amountReadFromFC = 0;
if(floatOffset == floatSize) // skip this step if there's still float buffered
while(inpOffset == inpSize) {
size_t samplesRead = 0;
BOOL isFloat = !!(inputFormat.mFormatFlags & kAudioFormatFlagIsFloat);
BOOL isUnsigned = !isFloat && !(inputFormat.mFormatFlags & kAudioFormatFlagIsSignedInteger);
size_t bitsPerSample = inputFormat.mBitsPerChannel;
// Approximately the most we want on input
ioNumberPackets = 4096;
ioNumberPackets = CHUNK_SIZE;
#if DSD_DECIMATE
const size_t sizeScale = 3;
#else
const size_t sizeScale = (bitsPerSample == 1) ? 10 : 3;
#endif
size_t newSize = ioNumberPackets * floatFormat.mBytesPerPacket;
if(!inputBuffer || inputBufferSize < newSize)
inputBuffer = realloc(inputBuffer, inputBufferSize = newSize);
inputBuffer = realloc(inputBuffer, inputBufferSize = newSize * sizeScale);
ssize_t amountToWrite = ioNumberPackets * floatFormat.mBytesPerPacket;
ssize_t amountToWrite = ioNumberPackets * inputFormat.mBytesPerPacket;
ssize_t bytesReadFromInput = 0;
while(bytesReadFromInput < amountToWrite && !stopping && !paused && !streamFormatChanged && [self shouldContinue] == YES && !([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES)) {
while(bytesReadFromInput < amountToWrite && !stopping && !paused && !streamFormatChanged && [self shouldContinue] == YES && [self endOfStream] == NO) {
AudioStreamBasicDescription inf;
uint32_t config;
if([self peekFormat:&inf channelConfig:&config]) {
@@ -194,7 +519,7 @@ void scale_by_volume(float *buffer, size_t count, float volume) {
}
}
AudioChunk *chunk = [self readChunkAsFloat32:((amountToWrite - bytesReadFromInput) / floatFormat.mBytesPerPacket)];
AudioChunk *chunk = [self readChunk:((amountToWrite - bytesReadFromInput) / inputFormat.mBytesPerPacket)];
inf = [chunk format];
size_t frameCount = [chunk frameCount];
config = [chunk channelConfig];
@@ -202,9 +527,11 @@ void scale_by_volume(float *buffer, size_t count, float volume) {
if(frameCount) {
NSData *samples = [chunk removeSamples:frameCount];
memcpy(((uint8_t *)inputBuffer) + bytesReadFromInput, [samples bytes], bytesRead);
if([chunk isHDCD]) {
[controller sustainHDCD];
}
lastChunkIn = [[AudioChunk alloc] init];
[lastChunkIn setFormat:inf];
[lastChunkIn setChannelConfig:config];
[lastChunkIn setLossless:[chunk lossless]];
[lastChunkIn assignSamples:[samples bytes] frameCount:frameCount];
}
bytesReadFromInput += bytesRead;
if(!frameCount) {
@@ -212,60 +539,117 @@ void scale_by_volume(float *buffer, size_t count, float volume) {
}
}
BOOL isBigEndian = !!(inputFormat.mFormatFlags & kAudioFormatFlagIsBigEndian);
if(!bytesReadFromInput) {
convertEntered = NO;
return nil;
return amountRead;
}
if(stopping || paused || streamFormatChanged || [self shouldContinue] == NO || ([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES)) {
if(!skipResampler) {
if(!is_postextrapolated_) {
is_postextrapolated_ = 1;
}
} else {
is_postextrapolated_ = 3;
}
if(bytesReadFromInput && isBigEndian) {
// Time for endian swap!
convert_be_to_le((uint8_t *)inputBuffer, inputFormat.mBitsPerChannel, bytesReadFromInput);
}
// Extrapolate start
if(!skipResampler && !is_preextrapolated_) {
size_t inputSamples = bytesReadFromInput / floatFormat.mBytesPerPacket;
size_t prime = MIN(inputSamples, PRIME_LEN_);
size_t _N_samples_to_add_ = N_samples_to_add_;
size_t newSize = _N_samples_to_add_ * floatFormat.mBytesPerPacket;
newSize += bytesReadFromInput;
if(newSize > inputBufferSize) {
inputBuffer = realloc(inputBuffer, inputBufferSize = newSize * 3);
if(bytesReadFromInput && isFloat && inputFormat.mBitsPerChannel == 64) {
// Time for precision loss from weird inputs
samplesRead = bytesReadFromInput / sizeof(double);
convert_f64_to_f32((float *)(((uint8_t *)inputBuffer) + bytesReadFromInput), (const double *)inputBuffer, samplesRead);
memmove(inputBuffer, ((uint8_t *)inputBuffer) + bytesReadFromInput, samplesRead * sizeof(float));
bytesReadFromInput = samplesRead * sizeof(float);
}
memmove(inputBuffer + _N_samples_to_add_ * floatFormat.mBytesPerPacket, inputBuffer, bytesReadFromInput);
lpc_extrapolate_bkwd(inputBuffer + _N_samples_to_add_ * floatFormat.mBytesPerPacket, inputSamples, prime, floatFormat.mChannelsPerFrame, LPC_ORDER, _N_samples_to_add_, &extrapolateBuffer, &extrapolateBufferSize);
bytesReadFromInput += _N_samples_to_add_ * floatFormat.mBytesPerPacket;
latencyEaten = N_samples_to_drop_;
is_preextrapolated_ = YES;
if(bytesReadFromInput && !isFloat) {
float gain = 1.0;
if(bitsPerSample == 1) {
samplesRead = bytesReadFromInput / inputFormat.mBytesPerPacket;
size_t buffer_adder = (bytesReadFromInput + 15) & ~15;
convert_dsd_to_f32((float *)(((uint8_t *)inputBuffer) + buffer_adder), (const uint8_t *)inputBuffer, samplesRead, inputFormat.mChannelsPerFrame
#if DSD_DECIMATE
,
dsd2pcm
#endif
);
#if !DSD_DECIMATE
samplesRead *= 8;
#endif
memmove(inputBuffer, ((const uint8_t *)inputBuffer) + buffer_adder, samplesRead * inputFormat.mChannelsPerFrame * sizeof(float));
bitsPerSample = 32;
bytesReadFromInput = samplesRead * inputFormat.mChannelsPerFrame * sizeof(float);
isFloat = YES;
} else if(bitsPerSample <= 8) {
samplesRead = bytesReadFromInput;
size_t buffer_adder = (bytesReadFromInput + 1) & ~1;
if(!isUnsigned)
convert_s8_to_s16((int16_t *)(((uint8_t *)inputBuffer) + buffer_adder), (const uint8_t *)inputBuffer, samplesRead);
else
convert_u8_to_s16((int16_t *)(((uint8_t *)inputBuffer) + buffer_adder), (const uint8_t *)inputBuffer, samplesRead);
memmove(inputBuffer, ((uint8_t *)inputBuffer) + buffer_adder, samplesRead * 2);
bitsPerSample = 16;
bytesReadFromInput = samplesRead * 2;
isUnsigned = NO;
}
if(hdcd_decoder) { // implied bits per sample is 16, produces 32 bit int scale
samplesRead = bytesReadFromInput / 2;
if(isUnsigned)
convert_u16_to_s16((int16_t *)inputBuffer, samplesRead);
size_t buffer_adder = (bytesReadFromInput + 3) & ~3;
convert_s16_to_hdcd_input((int32_t *)(((uint8_t *)inputBuffer) + buffer_adder), (int16_t *)inputBuffer, samplesRead);
memmove(inputBuffer, ((uint8_t *)inputBuffer) + buffer_adder, samplesRead * 4);
hdcd_process_stereo((hdcd_state_stereo_t *)hdcd_decoder, (int32_t *)inputBuffer, (int)(samplesRead / 2));
if(((hdcd_state_stereo_t *)hdcd_decoder)->channel[0].sustain &&
((hdcd_state_stereo_t *)hdcd_decoder)->channel[1].sustain) {
[controller sustainHDCD];
}
gain = 2.0;
bitsPerSample = 32;
bytesReadFromInput = samplesRead * 4;
isUnsigned = NO;
} else if(bitsPerSample <= 16) {
samplesRead = bytesReadFromInput / 2;
if(isUnsigned)
convert_u16_to_s16((int16_t *)inputBuffer, samplesRead);
size_t buffer_adder = (bytesReadFromInput + 15) & ~15; // vDSP functions expect aligned to four elements
vDSP_vflt16((const short *)inputBuffer, 1, (float *)(((uint8_t *)inputBuffer) + buffer_adder), 1, samplesRead);
float scale = 1ULL << 15;
vDSP_vsdiv((const float *)(((uint8_t *)inputBuffer) + buffer_adder), 1, &scale, (float *)(((uint8_t *)inputBuffer) + buffer_adder), 1, samplesRead);
memmove(inputBuffer, ((uint8_t *)inputBuffer) + buffer_adder, samplesRead * sizeof(float));
bitsPerSample = 32;
bytesReadFromInput = samplesRead * sizeof(float);
isUnsigned = NO;
isFloat = YES;
} else if(bitsPerSample <= 24) {
samplesRead = bytesReadFromInput / 3;
size_t buffer_adder = (bytesReadFromInput + 3) & ~3;
if(isUnsigned)
convert_u24_to_s32((int32_t *)(((uint8_t *)inputBuffer) + buffer_adder), (uint8_t *)inputBuffer, samplesRead);
else
convert_s24_to_s32((int32_t *)(((uint8_t *)inputBuffer) + buffer_adder), (uint8_t *)inputBuffer, samplesRead);
memmove(inputBuffer, ((uint8_t *)inputBuffer) + buffer_adder, samplesRead * 4);
bitsPerSample = 32;
bytesReadFromInput = samplesRead * 4;
isUnsigned = NO;
}
if(!isFloat && bitsPerSample <= 32) {
samplesRead = bytesReadFromInput / 4;
if(isUnsigned)
convert_u32_to_s32((int32_t *)inputBuffer, samplesRead);
size_t buffer_adder = (bytesReadFromInput + 31) & ~31; // vDSP functions expect aligned to four elements
vDSP_vflt32((const int *)inputBuffer, 1, (float *)(((uint8_t *)inputBuffer) + buffer_adder), 1, samplesRead);
float scale = (1ULL << 31) / gain;
vDSP_vsdiv((const float *)(((uint8_t *)inputBuffer) + buffer_adder), 1, &scale, (float *)(((uint8_t *)inputBuffer) + buffer_adder), 1, samplesRead);
memmove(inputBuffer, ((uint8_t *)inputBuffer) + buffer_adder, samplesRead * sizeof(float));
bitsPerSample = 32;
bytesReadFromInput = samplesRead * sizeof(float);
isUnsigned = NO;
isFloat = YES;
}
if(is_postextrapolated_ == 1) {
size_t inputSamples = bytesReadFromInput / floatFormat.mBytesPerPacket;
size_t prime = MIN(inputSamples, PRIME_LEN_);
size_t _N_samples_to_add_ = N_samples_to_add_;
size_t newSize = bytesReadFromInput;
newSize += _N_samples_to_add_ * floatFormat.mBytesPerPacket;
if(newSize > inputBufferSize) {
inputBuffer = realloc(inputBuffer, inputBufferSize = newSize * 3);
}
lpc_extrapolate_fwd(inputBuffer, inputSamples, prime, floatFormat.mChannelsPerFrame, LPC_ORDER, _N_samples_to_add_, &extrapolateBuffer, &extrapolateBufferSize);
bytesReadFromInput += _N_samples_to_add_ * floatFormat.mBytesPerPacket;
latencyEatenPost = N_samples_to_drop_;
is_postextrapolated_ = 2;
} else if(is_postextrapolated_ == 3) {
latencyEatenPost = 0;
#ifdef _DEBUG
[BadSampleCleaner cleanSamples:(float *)inputBuffer
amount:bytesReadFromInput / sizeof(float)
location:@"post int to float conversion"];
#endif
}
// Input now contains bytesReadFromInput worth of floats, in the input sample rate
@@ -273,91 +657,65 @@ void scale_by_volume(float *buffer, size_t count, float volume) {
inpOffset = 0;
}
ioNumberPackets = (UInt32)(inpSize - inpOffset);
if(inpOffset != inpSize && floatOffset == floatSize) {
#if DSD_DECIMATE
const float scaleModifier = (inputFormat.mBitsPerChannel == 1) ? 0.5f : 1.0f;
#endif
ioNumberPackets -= ioNumberPackets % floatFormat.mBytesPerPacket;
size_t inputSamples = (inpSize - inpOffset) / floatFormat.mBytesPerPacket;
if(ioNumberPackets) {
size_t inputSamples = ioNumberPackets / floatFormat.mBytesPerPacket;
ioNumberPackets = (UInt32)inputSamples;
ioNumberPackets = (UInt32)ceil((float)ioNumberPackets * sampleRatio);
ioNumberPackets += soxr_delay(soxr);
ioNumberPackets = (ioNumberPackets + 255) & ~255;
size_t newSize = ioNumberPackets * floatFormat.mBytesPerPacket;
if(!floatBuffer || floatBufferSize < newSize) {
if(newSize < (ioNumberPackets * dmFloatFormat.mBytesPerPacket))
newSize = ioNumberPackets * dmFloatFormat.mBytesPerPacket;
if(!floatBuffer || floatBufferSize < newSize)
floatBuffer = realloc(floatBuffer, floatBufferSize = newSize * 3);
}
if(stopping) {
convertEntered = NO;
return nil;
return 0;
}
size_t inputDone = 0;
size_t outputDone = 0;
if(!skipResampler) {
soxr_process(soxr, (float *)(((uint8_t *)inputBuffer) + inpOffset), inputSamples, &inputDone, floatBuffer, ioNumberPackets, &outputDone);
if(latencyEatenPost) {
// Post file or format change flush
size_t idone = 0, odone = 0;
do {
soxr_process(soxr, NULL, 0, &idone, floatBuffer + outputDone * floatFormat.mBytesPerPacket, ioNumberPackets - outputDone, &odone);
outputDone += odone;
} while(odone > 0);
}
} else {
memcpy(floatBuffer, (((uint8_t *)inputBuffer) + inpOffset), inputSamples * floatFormat.mBytesPerPacket);
inputDone = inputSamples;
outputDone = inputSamples;
}
inpOffset += inputDone * floatFormat.mBytesPerPacket;
if(latencyEaten) {
if(outputDone > latencyEaten) {
outputDone -= latencyEaten;
memmove(floatBuffer, floatBuffer + latencyEaten * floatFormat.mBytesPerPacket, outputDone * floatFormat.mBytesPerPacket);
latencyEaten = 0;
} else {
latencyEaten -= outputDone;
outputDone = 0;
}
amountReadFromFC = (int)(outputDone * floatFormat.mBytesPerPacket);
scale_by_volume((float *)floatBuffer, amountReadFromFC / sizeof(float), volumeScale
#if DSD_DECIMATE
* scaleModifier
#endif
);
floatSize = amountReadFromFC;
floatOffset = 0;
}
if(latencyEatenPost) {
if(outputDone > latencyEatenPost) {
outputDone -= latencyEatenPost;
} else {
outputDone = 0;
}
latencyEatenPost = 0;
}
if(floatOffset == floatSize)
goto tryagain;
ioNumberPackets = (UInt32)outputDone * floatFormat.mBytesPerPacket;
}
ioNumberPackets = (amount - amountRead);
if(ioNumberPackets > (floatSize - floatOffset))
ioNumberPackets = (UInt32)(floatSize - floatOffset);
if(ioNumberPackets) {
AudioChunk *chunk = [[AudioChunk alloc] init];
[chunk setFormat:nodeFormat];
if(nodeChannelConfig) {
[chunk setChannelConfig:nodeChannelConfig];
}
[self addObservers];
scale_by_volume(floatBuffer, ioNumberPackets / sizeof(float), volumeScale);
[chunk setStreamTimestamp:streamTimestamp];
[chunk setStreamTimeRatio:streamTimeRatio];
[chunk assignSamples:floatBuffer frameCount:ioNumberPackets / floatFormat.mBytesPerPacket];
streamTimestamp += [chunk durationRatioed];
convertEntered = NO;
return chunk;
}
ioNumberPackets -= ioNumberPackets % dmFloatFormat.mBytesPerPacket;
memcpy(((uint8_t *)dest) + amountRead, ((uint8_t *)floatBuffer) + floatOffset, ioNumberPackets);
floatOffset += ioNumberPackets;
amountRead += ioNumberPackets;
convertEntered = NO;
return nil;
return amountRead;
}
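The convert path above drains two staging buffers through offset/size pairs (inpOffset/inpSize for the integer-to-float stage, floatOffset/floatSize for the post-resampler floats), refilling a stage only once its offset catches up with its size. The consume side of that pattern, reduced to its essentials with illustrative names:

#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Copy up to `wanted` bytes out of a staging buffer tracked by an
   offset/size pair; the caller refills the buffer (resetting offset to 0 and
   size to the new fill level) whenever less than `wanted` comes back. */
static size_t drain_staging(uint8_t *dest, size_t wanted,
                            const uint8_t *staging, size_t *offset, size_t size) {
	size_t available = size - *offset;
	size_t take = wanted < available ? wanted : available;
	memcpy(dest, staging + *offset, take);
	*offset += take;
	return take;
}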
- (void)observeValueForKeyPath:(NSString *)keyPath
@@ -420,10 +778,9 @@ static float db_to_scale(float db) {
volumeScale = scale;
}
- (BOOL)setupWithInputFormat:(AudioStreamBasicDescription)inf withInputConfig:(uint32_t)inputConfig outputFormat:(AudioStreamBasicDescription)outf isLossless:(BOOL)lossless {
- (BOOL)setupWithInputFormat:(AudioStreamBasicDescription)inf withInputConfig:(uint32_t)inputConfig isLossless:(BOOL)lossless {
// Make the converter
inputFormat = inf;
outputFormat = outf;
inputChannelConfig = inputConfig;
@@ -434,6 +791,16 @@ static float db_to_scale(float db) {
if((!isFloat && !(inputFormat.mBitsPerChannel >= 1 && inputFormat.mBitsPerChannel <= 32)) || (isFloat && !(inputFormat.mBitsPerChannel == 32 || inputFormat.mBitsPerChannel == 64)))
return NO;
// These are really placeholders, as we're doing everything internally now
if(lossless &&
inputFormat.mBitsPerChannel == 16 &&
inputFormat.mChannelsPerFrame == 2 &&
inputFormat.mSampleRate == 44100) {
// possibly HDCD, run through decoder
hdcd_decoder = calloc(1, sizeof(hdcd_state_stereo_t));
hdcd_reset_stereo((hdcd_state_stereo_t *)hdcd_decoder, 44100);
}
floatFormat = inputFormat;
floatFormat.mFormatFlags = kAudioFormatFlagsNativeFloatPacked;
floatFormat.mBitsPerChannel = 32;
@@ -444,46 +811,29 @@ static float db_to_scale(float db) {
if(inputFormat.mBitsPerChannel == 1) {
// Decimate this for speed
floatFormat.mSampleRate *= 1.0 / 8.0;
dsd2pcmCount = floatFormat.mChannelsPerFrame;
dsd2pcm = (void **)calloc(dsd2pcmCount, sizeof(void *));
dsd2pcm[0] = dsd2pcm_alloc();
dsd2pcmLatency = dsd2pcm_latency(dsd2pcm[0]);
for(size_t i = 1; i < dsd2pcmCount; ++i) {
dsd2pcm[i] = dsd2pcm_dup(dsd2pcm[0]);
}
}
#endif
inpOffset = 0;
inpSize = 0;
// This is a post resampler format
floatOffset = 0;
floatSize = 0;
nodeFormat = floatFormat;
nodeFormat.mSampleRate = outputFormat.mSampleRate;
// This is a post resampler, post-down/upmix format
dmFloatFormat = floatFormat;
nodeFormat = dmFloatFormat;
nodeChannelConfig = inputChannelConfig;
sampleRatio = (double)outputFormat.mSampleRate / (double)floatFormat.mSampleRate;
skipResampler = fabs(sampleRatio - 1.0) < 1e-7;
if(!skipResampler) {
soxr_quality_spec_t q_spec = soxr_quality_spec(SOXR_HQ, 0);
soxr_io_spec_t io_spec = soxr_io_spec(SOXR_FLOAT32_I, SOXR_FLOAT32_I);
soxr_runtime_spec_t runtime_spec = soxr_runtime_spec(0);
soxr_error_t error;
soxr = soxr_create(floatFormat.mSampleRate, outputFormat.mSampleRate, floatFormat.mChannelsPerFrame, &error, &io_spec, &q_spec, &runtime_spec);
if(error)
return NO;
PRIME_LEN_ = MAX(floatFormat.mSampleRate / 20, 1024u);
PRIME_LEN_ = MIN(PRIME_LEN_, 16384u);
PRIME_LEN_ = MAX(PRIME_LEN_, (unsigned int)(2 * LPC_ORDER + 1));
N_samples_to_add_ = floatFormat.mSampleRate;
N_samples_to_drop_ = outputFormat.mSampleRate;
samples_len(&N_samples_to_add_, &N_samples_to_drop_, 20, 8192u);
is_preextrapolated_ = NO;
is_postextrapolated_ = 0;
}
latencyEaten = 0;
latencyEatenPost = 0;
PrintStreamDesc(&inf);
PrintStreamDesc(&nodeFormat);
@ -499,18 +849,12 @@ static float db_to_scale(float db) {
}
- (void)dealloc {
DLog(@"Converter dealloc");
DLog(@"Decoder dealloc");
[self removeObservers];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.volumeScaling" context:kConverterNodeContext];
paused = NO;
[self cleanUp];
[super cleanUp];
}
- (void)setOutputFormat:(AudioStreamBasicDescription)format {
DLog(@"SETTING OUTPUT FORMAT!");
outputFormat = format;
}
- (void)inputFormatDidChange:(AudioStreamBasicDescription)format inputConfig:(uint32_t)inputConfig {
@ -520,7 +864,7 @@ static float db_to_scale(float db) {
usleep(500);
}
[self cleanUp];
[self setupWithInputFormat:format withInputConfig:inputConfig outputFormat:self->outputFormat isLossless:rememberedLossless];
[self setupWithInputFormat:format withInputConfig:inputConfig isLossless:rememberedLossless];
}
- (void)setRGInfo:(NSDictionary *)rgi {
@ -534,15 +878,20 @@ static float db_to_scale(float db) {
while(convertEntered) {
usleep(500);
}
if(soxr) {
soxr_delete(soxr);
soxr = NULL;
if(hdcd_decoder) {
free(hdcd_decoder);
hdcd_decoder = NULL;
}
if(extrapolateBuffer) {
free(extrapolateBuffer);
extrapolateBuffer = NULL;
extrapolateBufferSize = 0;
#if DSD_DECIMATE
if(dsd2pcm && dsd2pcmCount) {
for(size_t i = 0; i < dsd2pcmCount; ++i) {
dsd2pcm_free(dsd2pcm[i]);
dsd2pcm[i] = NULL;
}
free(dsd2pcm);
dsd2pcm = NULL;
}
#endif
if(floatBuffer) {
free(floatBuffer);
floatBuffer = NULL;
@ -553,8 +902,8 @@ static float db_to_scale(float db) {
inputBuffer = NULL;
inputBufferSize = 0;
}
inpOffset = 0;
inpSize = 0;
floatOffset = 0;
floatSize = 0;
}
- (double)secondsBuffered {


@ -1,34 +0,0 @@
//
// DSPDownmixNode.h
// CogAudio
//
// Created by Christopher Snowhill on 2/13/25.
//
#ifndef DSPDownmixNode_h
#define DSPDownmixNode_h
#import <AudioToolbox/AudioToolbox.h>
#import <CogAudio/DSPNode.h>
@interface DSPDownmixNode : DSPNode {
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency;
- (BOOL)setup;
- (void)cleanUp;
- (void)resetBuffer;
- (BOOL)paused;
- (void)process;
- (AudioChunk * _Nullable)convert;
- (void)setOutputFormat:(AudioStreamBasicDescription)format withChannelConfig:(uint32_t)config;
@end
#endif /* DSPDownmixNode_h */


@ -1,201 +0,0 @@
//
// DSPDownmixNode.m
// CogAudio Framework
//
// Created by Christopher Snowhill on 2/13/25.
//
#import <Foundation/Foundation.h>
#import "Downmix.h"
#import "Logging.h"
#import "DSPDownmixNode.h"
@implementation DSPDownmixNode {
DownmixProcessor *downmix;
BOOL stopping, paused;
BOOL processEntered;
BOOL formatSet;
AudioStreamBasicDescription lastInputFormat;
AudioStreamBasicDescription inputFormat;
AudioStreamBasicDescription outputFormat;
uint32_t lastInputChannelConfig, inputChannelConfig;
uint32_t outputChannelConfig;
float outBuffer[4096 * 32];
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency {
self = [super initWithController:c previous:p latency:latency];
return self;
}
- (void)dealloc {
DLog(@"Downmix dealloc");
[self setShouldContinue:NO];
[self cleanUp];
[super cleanUp];
}
- (BOOL)fullInit {
if(formatSet) {
downmix = [[DownmixProcessor alloc] initWithInputFormat:inputFormat inputConfig:inputChannelConfig andOutputFormat:outputFormat outputConfig:outputChannelConfig];
if(!downmix) {
return NO;
}
}
return YES;
}
- (void)fullShutdown {
downmix = nil;
}
- (BOOL)setup {
if(stopping)
return NO;
[self fullShutdown];
return [self fullInit];
}
- (void)cleanUp {
stopping = YES;
while(processEntered) {
usleep(500);
}
[self fullShutdown];
formatSet = NO;
}
- (void)resetBuffer {
paused = YES;
while(processEntered) {
usleep(500);
}
[buffer reset];
paused = NO;
}
- (void)setOutputFormat:(AudioStreamBasicDescription)format withChannelConfig:(uint32_t)config {
if(memcmp(&outputFormat, &format, sizeof(outputFormat)) != 0 ||
outputChannelConfig != config) {
paused = YES;
while(processEntered) {
usleep(500);
}
[buffer reset];
[self fullShutdown];
paused = NO;
}
outputFormat = format;
outputChannelConfig = config;
formatSet = YES;
}
- (BOOL)paused {
return paused;
}
- (void)process {
while([self shouldContinue] == YES) {
if(paused || endOfStream) {
usleep(500);
continue;
}
@autoreleasepool {
AudioChunk *chunk = nil;
chunk = [self convert];
if(!chunk || ![chunk frameCount]) {
if([previousNode endOfStream] == YES) {
usleep(500);
endOfStream = YES;
continue;
}
if(paused) {
continue;
}
usleep(500);
} else {
[self writeChunk:chunk];
chunk = nil;
}
}
}
}
- (AudioChunk *)convert {
if(stopping)
return nil;
processEntered = YES;
if(stopping || ([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES) || [self shouldContinue] == NO) {
processEntered = NO;
return nil;
}
if(![self peekFormat:&inputFormat channelConfig:&inputChannelConfig]) {
processEntered = NO;
return nil;
}
if(!inputFormat.mSampleRate ||
!inputFormat.mBitsPerChannel ||
!inputFormat.mChannelsPerFrame ||
!inputFormat.mBytesPerFrame ||
!inputFormat.mFramesPerPacket ||
!inputFormat.mBytesPerPacket) {
processEntered = NO;
return nil;
}
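// Tear down and rebuild the downmix processor whenever the input format or channel layout changes, or when an output format is set but no processor exists yet.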
if((formatSet && !downmix) ||
memcmp(&inputFormat, &lastInputFormat, sizeof(inputFormat)) != 0 ||
inputChannelConfig != lastInputChannelConfig) {
lastInputFormat = inputFormat;
lastInputChannelConfig = inputChannelConfig;
[self fullShutdown];
if(formatSet && ![self setup]) {
processEntered = NO;
return nil;
}
}
if(!downmix) {
processEntered = NO;
return [self readChunk:4096];
}
AudioChunk *chunk = [self readChunkAsFloat32:4096];
if(!chunk || ![chunk frameCount]) {
processEntered = NO;
return nil;
}
double streamTimestamp = [chunk streamTimestamp];
size_t frameCount = [chunk frameCount];
NSData *sampleData = [chunk removeSamples:frameCount];
[downmix process:[sampleData bytes] frameCount:frameCount output:&outBuffer[0]];
AudioChunk *outputChunk = [[AudioChunk alloc] init];
[outputChunk setFormat:outputFormat];
if(outputChannelConfig) {
[outputChunk setChannelConfig:outputChannelConfig];
}
if([chunk isHDCD]) [outputChunk setHDCD];
[outputChunk setStreamTimestamp:streamTimestamp];
[outputChunk setStreamTimeRatio:[chunk streamTimeRatio]];
[outputChunk assignSamples:&outBuffer[0] frameCount:frameCount];
processEntered = NO;
return outputChunk;
}
@end


@ -1,31 +0,0 @@
//
// DSPEqualizerNode.h
// CogAudio
//
// Created by Christopher Snowhill on 2/11/25.
//
#ifndef DSPEqualizerNode_h
#define DSPEqualizerNode_h
#import <CogAudio/DSPNode.h>
@interface DSPEqualizerNode : DSPNode {
float *samplePtr;
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency;
- (BOOL)setup;
- (void)cleanUp;
- (void)resetBuffer;
- (BOOL)paused;
- (void)process;
- (AudioChunk * _Nullable)convert;
@end
#endif /* DSPEqualizerNode_h */


@ -1,401 +0,0 @@
//
// DSPEqualizerNode.m
// CogAudio Framework
//
// Created by Christopher Snowhill on 2/11/25.
//
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import <AudioUnit/AudioUnit.h>
#import <Accelerate/Accelerate.h>
#import "DSPEqualizerNode.h"
#import "OutputNode.h"
#import "Logging.h"
#import "AudioPlayer.h"
extern void scale_by_volume(float *buffer, size_t count, float volume);
static void * kDSPEqualizerNodeContext = &kDSPEqualizerNodeContext;
@implementation DSPEqualizerNode {
BOOL enableEqualizer;
BOOL equalizerInitialized;
double equalizerPreamp;
__weak AudioPlayer *audioPlayer;
AudioUnit _eq;
AudioTimeStamp timeStamp;
BOOL stopping, paused;
BOOL processEntered;
BOOL observersapplied;
AudioStreamBasicDescription lastInputFormat;
AudioStreamBasicDescription inputFormat;
uint32_t lastInputChannelConfig, inputChannelConfig;
uint32_t outputChannelConfig;
float inBuffer[4096 * 32];
float eqBuffer[4096 * 32];
float outBuffer[4096 * 32];
}
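// Deinterleave up to count frames from the packed input into each AudioBuffer at the given frame offset, marking every destination buffer as mono.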
static void fillBuffers(AudioBufferList *ioData, const float *inbuffer, size_t count, size_t offset) {
const size_t channels = ioData->mNumberBuffers;
for(int i = 0; i < channels; ++i) {
const size_t maxCount = (ioData->mBuffers[i].mDataByteSize / sizeof(float)) - offset;
float *output = ((float *)ioData->mBuffers[i].mData) + offset;
const float *input = inbuffer + i;
cblas_scopy((int)((count > maxCount) ? maxCount : count), input, (int)channels, output, 1);
ioData->mBuffers[i].mNumberChannels = 1;
}
}
static void clearBuffers(AudioBufferList *ioData, size_t count, size_t offset) {
for(int i = 0; i < ioData->mNumberBuffers; ++i) {
memset((uint8_t *)ioData->mBuffers[i].mData + offset * sizeof(float), 0, count * sizeof(float));
ioData->mBuffers[i].mNumberChannels = 1;
}
}
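// Render callback for the GraphicEQ unit: fills its non-interleaved input buffers from the samples staged in samplePtr, or outputs silence if the request exceeds one block.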
static OSStatus eqRenderCallback(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList *ioData) {
if(inNumberFrames > 4096 || !inRefCon) {
clearBuffers(ioData, inNumberFrames, 0);
return 0;
}
DSPEqualizerNode *_self = (__bridge DSPEqualizerNode *)inRefCon;
fillBuffers(ioData, _self->samplePtr, inNumberFrames, 0);
return 0;
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency {
self = [super initWithController:c previous:p latency:latency];
if(self) {
NSUserDefaults *defaults = [[NSUserDefaultsController sharedUserDefaultsController] defaults];
enableEqualizer = [defaults boolForKey:@"GraphicEQenable"];
float preamp = [defaults floatForKey:@"eqPreamp"];
equalizerPreamp = pow(10.0, preamp / 20.0);
OutputNode *outputNode = c;
audioPlayer = [outputNode controller];
[self addObservers];
}
return self;
}
- (void)dealloc {
DLog(@"Equalizer dealloc");
[self setShouldContinue:NO];
[self cleanUp];
[self removeObservers];
[super cleanUp];
}
- (void)addObservers {
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.GraphicEQenable" options:0 context:kDSPEqualizerNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.eqPreamp" options:0 context:kDSPEqualizerNodeContext];
observersapplied = YES;
}
- (void)removeObservers {
if(observersapplied) {
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.GraphicEQenable" context:kDSPEqualizerNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.eqPreamp" context:kDSPEqualizerNodeContext];
observersapplied = NO;
}
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
if(context != kDSPEqualizerNodeContext) {
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
return;
}
if([keyPath isEqualToString:@"values.GraphicEQenable"]) {
NSUserDefaults *defaults = [[NSUserDefaultsController sharedUserDefaultsController] defaults];
enableEqualizer = [defaults boolForKey:@"GraphicEQenable"];
} else if([keyPath isEqualToString:@"values.eqPreamp"]) {
NSUserDefaults *defaults = [[NSUserDefaultsController sharedUserDefaultsController] defaults];
float preamp = [defaults floatForKey:@"eqPreamp"];
equalizerPreamp = pow(10.0, preamp / 20.0);
}
}
- (AudioPlayer *)audioPlayer {
return audioPlayer;
}
- (BOOL)fullInit {
if(enableEqualizer) {
AudioComponentDescription desc;
desc.componentType = kAudioUnitType_Effect;
desc.componentSubType = kAudioUnitSubType_GraphicEQ;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
desc.componentFlags = 0;
desc.componentFlagsMask = 0;
AudioComponent comp = NULL;
comp = AudioComponentFindNext(comp, &desc);
if(!comp) {
return NO;
}
OSStatus _err = AudioComponentInstanceNew(comp, &_eq);
if(_err != noErr) {
return NO;
}
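// Configure the EQ unit: limit render slices to 4096 frames and request the highest render quality.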
UInt32 value;
UInt32 size = sizeof(value);
value = 4096;
AudioUnitSetProperty(_eq, kAudioUnitProperty_MaximumFramesPerSlice,
kAudioUnitScope_Global, 0, &value, size);
value = 127;
AudioUnitSetProperty(_eq, kAudioUnitProperty_RenderQuality,
kAudioUnitScope_Global, 0, &value, size);
AURenderCallbackStruct callbackStruct;
callbackStruct.inputProcRefCon = (__bridge void *)self;
callbackStruct.inputProc = eqRenderCallback;
AudioUnitSetProperty(_eq, kAudioUnitProperty_SetRenderCallback,
kAudioUnitScope_Input, 0, &callbackStruct, sizeof(callbackStruct));
AudioUnitReset(_eq, kAudioUnitScope_Input, 0);
AudioUnitReset(_eq, kAudioUnitScope_Output, 0);
AudioUnitReset(_eq, kAudioUnitScope_Global, 0);
AudioStreamBasicDescription asbd = inputFormat;
// Of course, non-interleaved has only one sample per frame/packet, per buffer
asbd.mFormatFlags |= kAudioFormatFlagIsNonInterleaved;
asbd.mBytesPerFrame = sizeof(float);
asbd.mBytesPerPacket = sizeof(float);
asbd.mFramesPerPacket = 1;
UInt32 maximumFrames = 4096;
AudioUnitSetProperty(_eq, kAudioUnitProperty_MaximumFramesPerSlice, kAudioUnitScope_Global, 0, &maximumFrames, sizeof(maximumFrames));
AudioUnitSetProperty(_eq, kAudioUnitProperty_StreamFormat,
kAudioUnitScope_Input, 0, &asbd, sizeof(asbd));
AudioUnitSetProperty(_eq, kAudioUnitProperty_StreamFormat,
kAudioUnitScope_Output, 0, &asbd, sizeof(asbd));
AudioUnitReset(_eq, kAudioUnitScope_Input, 0);
AudioUnitReset(_eq, kAudioUnitScope_Output, 0);
AudioUnitReset(_eq, kAudioUnitScope_Global, 0);
_err = AudioUnitInitialize(_eq);
if(_err != noErr) {
return NO;
}
bzero(&timeStamp, sizeof(timeStamp));
timeStamp.mFlags = kAudioTimeStampSampleTimeValid;
equalizerInitialized = YES;
[[self audioPlayer] beginEqualizer:_eq];
}
return YES;
}
- (void)fullShutdown {
if(_eq) {
if(equalizerInitialized) {
[[self audioPlayer] endEqualizer:_eq];
AudioUnitUninitialize(_eq);
equalizerInitialized = NO;
}
AudioComponentInstanceDispose(_eq);
_eq = NULL;
}
}
- (BOOL)setup {
if(stopping)
return NO;
[self fullShutdown];
return [self fullInit];
}
- (void)cleanUp {
stopping = YES;
while(processEntered) {
usleep(500);
}
[self fullShutdown];
}
- (void)resetBuffer {
paused = YES;
while(processEntered) {
usleep(500);
}
[buffer reset];
[self fullShutdown];
paused = NO;
}
- (BOOL)paused {
return paused;
}
- (void)process {
while([self shouldContinue] == YES) {
if(paused || endOfStream) {
usleep(500);
continue;
}
@autoreleasepool {
AudioChunk *chunk = nil;
chunk = [self convert];
if(!chunk || ![chunk frameCount]) {
if([previousNode endOfStream] == YES) {
usleep(500);
endOfStream = YES;
continue;
}
if(paused) {
continue;
}
usleep(500);
} else {
[self writeChunk:chunk];
chunk = nil;
}
if(!enableEqualizer && equalizerInitialized) {
[self fullShutdown];
}
}
}
}
- (AudioChunk *)convert {
if(stopping)
return nil;
processEntered = YES;
if(stopping || ([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES) || [self shouldContinue] == NO) {
processEntered = NO;
return nil;
}
if(![self peekFormat:&inputFormat channelConfig:&inputChannelConfig]) {
processEntered = NO;
return nil;
}
if(!inputFormat.mSampleRate ||
!inputFormat.mBitsPerChannel ||
!inputFormat.mChannelsPerFrame ||
!inputFormat.mBytesPerFrame ||
!inputFormat.mFramesPerPacket ||
!inputFormat.mBytesPerPacket) {
processEntered = NO;
return nil;
}
if((enableEqualizer && !equalizerInitialized) ||
memcmp(&inputFormat, &lastInputFormat, sizeof(inputFormat)) != 0 ||
inputChannelConfig != lastInputChannelConfig) {
lastInputFormat = inputFormat;
lastInputChannelConfig = inputChannelConfig;
[self fullShutdown];
if(enableEqualizer && ![self setup]) {
processEntered = NO;
return nil;
}
}
if(!equalizerInitialized) {
processEntered = NO;
return [self readChunk:4096];
}
AudioChunk *chunk = [self readChunkAsFloat32:4096];
if(!chunk || ![chunk frameCount]) {
processEntered = NO;
return nil;
}
double streamTimestamp = [chunk streamTimestamp];
samplePtr = &inBuffer[0];
size_t channels = inputFormat.mChannelsPerFrame;
size_t frameCount = [chunk frameCount];
NSData *sampleData = [chunk removeSamples:frameCount];
cblas_scopy((int)(frameCount * channels), [sampleData bytes], 1, &inBuffer[0], 1);
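// The render callback reads the staged samples from inBuffer; build a non-interleaved AudioBufferList over eqBuffer, one mono buffer per channel, to receive the EQ output.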
const size_t channelsminusone = channels - 1;
uint8_t tempBuffer[sizeof(AudioBufferList) + sizeof(AudioBuffer) * channelsminusone];
AudioBufferList *ioData = (AudioBufferList *)&tempBuffer[0];
ioData->mNumberBuffers = (UInt32)channels;
for(size_t i = 0; i < channels; ++i) {
ioData->mBuffers[i].mData = &eqBuffer[4096 * i];
ioData->mBuffers[i].mDataByteSize = (UInt32)(frameCount * sizeof(float));
ioData->mBuffers[i].mNumberChannels = 1;
}
OSStatus status = AudioUnitRender(_eq, NULL, &timeStamp, 0, (UInt32)frameCount, ioData);
if(status != noErr) {
processEntered = NO;
return nil;
}
timeStamp.mSampleTime += ((double)frameCount) / inputFormat.mSampleRate;
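// Reinterleave the EQ's planar output back into packed frames in outBuffer.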
for(int i = 0; i < channels; ++i) {
cblas_scopy((int)frameCount, &eqBuffer[4096 * i], 1, &outBuffer[i], (int)channels);
}
AudioChunk *outputChunk = nil;
if(frameCount) {
scale_by_volume(&outBuffer[0], frameCount * channels, equalizerPreamp);
outputChunk = [[AudioChunk alloc] init];
[outputChunk setFormat:inputFormat];
if(outputChannelConfig) {
[outputChunk setChannelConfig:inputChannelConfig];
}
if([chunk isHDCD]) [outputChunk setHDCD];
[outputChunk setStreamTimestamp:streamTimestamp];
[outputChunk setStreamTimeRatio:[chunk streamTimeRatio]];
[outputChunk assignSamples:&outBuffer[0] frameCount:frameCount];
}
processEntered = NO;
return outputChunk;
}
@end


@ -1,30 +0,0 @@
//
// DSPFSurroundNode.h
// CogAudio
//
// Created by Christopher Snowhill on 2/11/25.
//
#ifndef DSPFSurroundNode_h
#define DSPFSurroundNode_h
#import <CogAudio/DSPNode.h>
@interface DSPFSurroundNode : DSPNode {
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency;
- (BOOL)setup;
- (void)cleanUp;
- (void)resetBuffer;
- (BOOL)paused;
- (void)process;
- (AudioChunk * _Nullable)convert;
@end
#endif /* DSPFSurroundNode_h */


@ -1,275 +0,0 @@
//
// DSPFSurroundNode.m
// CogAudio Framework
//
// Created by Christopher Snowhill on 2/11/25.
//
#import <Foundation/Foundation.h>
#import <Accelerate/Accelerate.h>
#import "DSPFSurroundNode.h"
#import "Logging.h"
#import "FSurroundFilter.h"
#define OCTAVES 5
static void * kDSPFSurroundNodeContext = &kDSPFSurroundNodeContext;
@implementation DSPFSurroundNode {
BOOL enableFSurround;
BOOL FSurroundDelayRemoved;
FSurroundFilter *fsurround;
BOOL stopping, paused;
BOOL processEntered;
BOOL observersapplied;
AudioStreamBasicDescription lastInputFormat;
AudioStreamBasicDescription inputFormat;
AudioStreamBasicDescription outputFormat;
uint32_t lastInputChannelConfig, inputChannelConfig;
uint32_t outputChannelConfig;
float inBuffer[4096 * 2];
float outBuffer[8192 * 6];
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency {
self = [super initWithController:c previous:p latency:latency];
if(self) {
NSUserDefaults *defaults = [[NSUserDefaultsController sharedUserDefaultsController] defaults];
enableFSurround = [defaults boolForKey:@"enableFSurround"];
[self addObservers];
}
return self;
}
- (void)dealloc {
DLog(@"FreeSurround dealloc");
[self setShouldContinue:NO];
[self cleanUp];
[self removeObservers];
[super cleanUp];
}
- (void)addObservers {
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.enableFSurround" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPFSurroundNodeContext];
observersapplied = YES;
}
- (void)removeObservers {
if(observersapplied) {
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.enableFSurround" context:kDSPFSurroundNodeContext];
observersapplied = NO;
}
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
if(context != kDSPFSurroundNodeContext) {
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
return;
}
if([keyPath isEqualToString:@"values.enableFSurround"]) {
NSUserDefaults *defaults = [[NSUserDefaultsController sharedUserDefaultsController] defaults];
enableFSurround = [defaults boolForKey:@"enableFSurround"];
}
}
- (BOOL)fullInit {
if(enableFSurround && inputFormat.mChannelsPerFrame == 2) {
fsurround = [[FSurroundFilter alloc] initWithSampleRate:inputFormat.mSampleRate];
if(!fsurround) {
return NO;
}
outputFormat = inputFormat;
outputFormat.mChannelsPerFrame = [fsurround channelCount];
outputFormat.mBytesPerFrame = sizeof(float) * outputFormat.mChannelsPerFrame;
outputFormat.mBytesPerPacket = outputFormat.mBytesPerFrame * outputFormat.mFramesPerPacket;
outputChannelConfig = [fsurround channelConfig];
FSurroundDelayRemoved = NO;
} else {
fsurround = nil;
}
return YES;
}
- (void)fullShutdown {
fsurround = nil;
}
- (BOOL)setup {
if(stopping)
return NO;
[self fullShutdown];
return [self fullInit];
}
- (void)cleanUp {
stopping = YES;
while(processEntered) {
usleep(500);
}
[self fullShutdown];
}
- (void)resetBuffer {
paused = YES;
while(processEntered) {
usleep(500);
}
[buffer reset];
[self fullShutdown];
paused = NO;
}
- (BOOL)paused {
return paused;
}
- (void)process {
while([self shouldContinue] == YES) {
if(paused || endOfStream) {
usleep(500);
continue;
}
@autoreleasepool {
AudioChunk *chunk = nil;
chunk = [self convert];
if(!chunk || ![chunk frameCount]) {
if([previousNode endOfStream] == YES) {
usleep(500);
endOfStream = YES;
continue;
}
if(paused) {
continue;
}
usleep(500);
} else {
[self writeChunk:chunk];
chunk = nil;
}
if(!enableFSurround && fsurround) {
[self fullShutdown];
}
}
}
}
- (AudioChunk *)convert {
if(stopping)
return nil;
processEntered = YES;
if(stopping || ([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES) || [self shouldContinue] == NO) {
processEntered = NO;
return nil;
}
if(![self peekFormat:&inputFormat channelConfig:&inputChannelConfig]) {
processEntered = NO;
return nil;
}
if(!inputFormat.mSampleRate ||
!inputFormat.mBitsPerChannel ||
!inputFormat.mChannelsPerFrame ||
!inputFormat.mBytesPerFrame ||
!inputFormat.mFramesPerPacket ||
!inputFormat.mBytesPerPacket) {
processEntered = NO;
return nil;
}
if((enableFSurround && !fsurround) ||
memcmp(&inputFormat, &lastInputFormat, sizeof(inputFormat)) != 0 ||
inputChannelConfig != lastInputChannelConfig) {
lastInputFormat = inputFormat;
lastInputChannelConfig = inputChannelConfig;
[self fullShutdown];
if(enableFSurround && ![self setup]) {
processEntered = NO;
return nil;
}
}
if(!fsurround) {
processEntered = NO;
return [self readChunk:4096];
}
size_t totalRequestedSamples = 4096;
size_t totalFrameCount = 0;
AudioChunk *chunk = [self readAndMergeChunksAsFloat32:totalRequestedSamples];
if(!chunk || ![chunk frameCount]) {
processEntered = NO;
return nil;
}
double streamTimestamp = [chunk streamTimestamp];
float *samplePtr = &inBuffer[0];
size_t frameCount = [chunk frameCount];
NSData *sampleData = [chunk removeSamples:frameCount];
cblas_scopy((int)frameCount * 2, [sampleData bytes], 1, &samplePtr[0], 1);
totalFrameCount = frameCount;
size_t countToProcess = totalFrameCount;
size_t samplesRendered;
if(countToProcess < 4096) {
bzero(&inBuffer[countToProcess * 2], (4096 - countToProcess) * 2 * sizeof(float));
countToProcess = 4096;
}
[fsurround process:&inBuffer[0] output:&outBuffer[0] count:(int)countToProcess];
samplePtr = &outBuffer[0];
samplesRendered = totalFrameCount;
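// If this block was short, run one block of silence through the decoder, writing directly after the first block's output, so the trailing 2048 frames of filter latency are emitted as well.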
if(totalFrameCount < 4096) {
bzero(&outBuffer[4096 * 6], 4096 * 2 * sizeof(float));
[fsurround process:&outBuffer[4096 * 6] output:&outBuffer[4096 * 6] count:4096];
samplesRendered += 2048;
}
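// FreeSurround carries a fixed 2048-frame processing delay; skip it once at the start of the stream.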
if(!FSurroundDelayRemoved) {
FSurroundDelayRemoved = YES;
if(samplesRendered > 2048) {
samplePtr += 2048 * 6;
samplesRendered -= 2048;
}
}
AudioChunk *outputChunk = nil;
if(samplesRendered) {
outputChunk = [[AudioChunk alloc] init];
[outputChunk setFormat:outputFormat];
if(outputChannelConfig) {
[outputChunk setChannelConfig:outputChannelConfig];
}
if([chunk isHDCD]) [outputChunk setHDCD];
[outputChunk setStreamTimestamp:streamTimestamp];
[outputChunk setStreamTimeRatio:[chunk streamTimeRatio]];
[outputChunk assignSamples:samplePtr frameCount:samplesRendered];
}
processEntered = NO;
return outputChunk;
}
@end


@ -1,35 +0,0 @@
//
// DSPHRTFNode.h
// CogAudio
//
// Created by Christopher Snowhill on 2/11/25.
//
#ifndef DSPHRTFNode_h
#define DSPHRTFNode_h
#import <simd/types.h>
#import <CogAudio/DSPNode.h>
@interface DSPHRTFNode : DSPNode {
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency;
- (BOOL)setup;
- (void)cleanUp;
- (void)resetBuffer;
- (BOOL)paused;
- (void)process;
- (AudioChunk * _Nullable)convert;
- (void)reportMotion:(simd_float4x4)matrix;
- (void)resetReferencePosition:(NSNotification *_Nullable)notification;
@end
#endif /* DSPHRTFNode_h */


@ -1,434 +0,0 @@
//
// DSPHRTFNode.m
// CogAudio Framework
//
// Created by Christopher Snowhill on 2/11/25.
//
#import <Foundation/Foundation.h>
#import <CoreMotion/CoreMotion.h>
#import "Logging.h"
#import "DSPHRTFNode.h"
#import "lpc.h"
#import "HeadphoneFilter.h"
#include <AvailabilityMacros.h>
#if defined(MAC_OS_X_VERSION_14_0) && MAC_OS_X_VERSION_MAX_ALLOWED >= MAC_OS_X_VERSION_14_0
#define MOTION_MANAGER 1
#endif
static void * kDSPHRTFNodeContext = &kDSPHRTFNodeContext;
static NSString *CogPlaybackDidResetHeadTracking = @"CogPlaybackDigResetHeadTracking";
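// Convert a Core Motion rotation matrix into the 4x4 orientation matrix used for head tracking, remapping axes (with sign flips) to the filter's coordinate convention.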
static simd_float4x4 convertMatrix(CMRotationMatrix r) {
simd_float4x4 matrix = {
simd_make_float4(r.m33, -r.m31, r.m32, 0.0f),
simd_make_float4(r.m13, -r.m11, r.m12, 0.0f),
simd_make_float4(r.m23, -r.m21, r.m22, 0.0f),
simd_make_float4(0.0f, 0.0f, 0.0f, 1.0f)
};
return matrix;
}
#ifdef MOTION_MANAGER
static NSLock *motionManagerLock = nil;
API_AVAILABLE(macos(14.0)) static CMHeadphoneMotionManager *motionManager = nil;
static DSPHRTFNode *registeredMotionListener = nil;
#endif
static void registerMotionListener(DSPHRTFNode *listener) {
#ifdef MOTION_MANAGER
if(@available(macOS 14, *)) {
[motionManagerLock lock];
if([motionManager isDeviceMotionActive]) {
[motionManager stopDeviceMotionUpdates];
}
if([motionManager isDeviceMotionAvailable]) {
registeredMotionListener = listener;
[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:^(CMDeviceMotion * _Nullable motion, NSError * _Nullable error) {
if(motion) {
[motionManagerLock lock];
[registeredMotionListener reportMotion:convertMatrix(motion.attitude.rotationMatrix)];
[motionManagerLock unlock];
}
}];
}
[motionManagerLock unlock];
}
#endif
}
static void unregisterMotionListener(void) {
#ifdef MOTION_MANAGER
if(@available(macOS 14, *)) {
[motionManagerLock lock];
if([motionManager isDeviceMotionActive]) {
[motionManager stopDeviceMotionUpdates];
}
registeredMotionListener = nil;
[motionManagerLock unlock];
}
#endif
}
@implementation DSPHRTFNode {
BOOL enableHrtf;
BOOL enableHeadTracking;
BOOL lastEnableHeadTracking;
HeadphoneFilter *hrtf;
BOOL stopping, paused;
BOOL processEntered;
BOOL resetFilter;
size_t needPrefill;
BOOL observersapplied;
AudioStreamBasicDescription lastInputFormat;
AudioStreamBasicDescription inputFormat;
AudioStreamBasicDescription outputFormat;
uint32_t lastInputChannelConfig, inputChannelConfig;
uint32_t outputChannelConfig;
BOOL referenceMatrixSet;
BOOL rotationMatrixUpdated;
simd_float4x4 rotationMatrix;
simd_float4x4 referenceMatrix;
float prefillBuffer[4096 * 32];
float outBuffer[4096 * 2];
void *extrapolate_buffer;
size_t extrapolate_buffer_size;
}
+ (void)initialize {
#ifdef MOTION_MANAGER
motionManagerLock = [[NSLock alloc] init];
if(@available(macOS 14, *)) {
CMAuthorizationStatus status = [CMHeadphoneMotionManager authorizationStatus];
if(status == CMAuthorizationStatusDenied) {
ALog(@"Headphone motion not authorized");
return;
} else if(status == CMAuthorizationStatusAuthorized) {
ALog(@"Headphone motion authorized");
} else if(status == CMAuthorizationStatusRestricted) {
ALog(@"Headphone motion restricted");
} else if(status == CMAuthorizationStatusNotDetermined) {
ALog(@"Headphone motion status not determined; will prompt for access");
}
motionManager = [[CMHeadphoneMotionManager alloc] init];
}
#endif
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency {
self = [super initWithController:c previous:p latency:latency];
if(self) {
NSUserDefaults *defaults = [[NSUserDefaultsController sharedUserDefaultsController] defaults];
enableHrtf = [defaults boolForKey:@"enableHrtf"];
enableHeadTracking = [defaults boolForKey:@"enableHeadTracking"];
rotationMatrix = matrix_identity_float4x4;
[self addObservers];
}
return self;
}
- (void)dealloc {
DLog(@"HRTF dealloc");
[self setShouldContinue:NO];
[self cleanUp];
[self removeObservers];
[super cleanUp];
if(extrapolate_buffer) {
free(extrapolate_buffer);
extrapolate_buffer = NULL;
extrapolate_buffer_size = 0;
}
}
- (void)addObservers {
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.enableHrtf" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPHRTFNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.enableHeadTracking" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPHRTFNodeContext];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(resetReferencePosition:) name:CogPlaybackDidResetHeadTracking object:nil];
observersapplied = YES;
}
- (void)removeObservers {
if(observersapplied) {
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.enableHrtf" context:kDSPHRTFNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.enableHeadTracking" context:kDSPHRTFNodeContext];
[[NSNotificationCenter defaultCenter] removeObserver:self name:CogPlaybackDidResetHeadTracking object:nil];
observersapplied = NO;
}
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
if(context != kDSPHRTFNodeContext) {
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
return;
}
if([keyPath isEqualToString:@"values.enableHrtf"] ||
[keyPath isEqualToString:@"values.enableHeadTracking"]) {
NSUserDefaults *defaults = [[NSUserDefaultsController sharedUserDefaultsController] defaults];
enableHrtf = [defaults boolForKey:@"enableHrtf"];
enableHeadTracking = [defaults boolForKey:@"enableHeadTracking"];
resetFilter = YES;
}
}
- (BOOL)fullInit {
if(enableHrtf) {
NSURL *presetUrl = [[NSBundle mainBundle] URLForResource:@"SADIE_D02-96000" withExtension:@"mhr"];
rotationMatrixUpdated = NO;
simd_float4x4 matrix;
if(!referenceMatrixSet || !enableHeadTracking) {
referenceMatrixSet = NO;
matrix = matrix_identity_float4x4;
self->referenceMatrix = matrix;
if(enableHeadTracking) {
lastEnableHeadTracking = YES;
registerMotionListener(self);
} else if(lastEnableHeadTracking) {
lastEnableHeadTracking = NO;
unregisterMotionListener();
}
} else {
simd_float4x4 mirrorTransform = {
simd_make_float4(-1.0, 0.0, 0.0, 0.0),
simd_make_float4(0.0, 1.0, 0.0, 0.0),
simd_make_float4(0.0, 0.0, 1.0, 0.0),
simd_make_float4(0.0, 0.0, 0.0, 1.0)
};
matrix = simd_mul(mirrorTransform, rotationMatrix);
matrix = simd_mul(matrix, referenceMatrix);
}
hrtf = [[HeadphoneFilter alloc] initWithImpulseFile:presetUrl forSampleRate:inputFormat.mSampleRate withInputChannels:inputFormat.mChannelsPerFrame withConfig:inputChannelConfig withMatrix:matrix];
if(!hrtf) {
return NO;
}
outputFormat = inputFormat;
outputFormat.mChannelsPerFrame = 2;
outputFormat.mBytesPerFrame = sizeof(float) * outputFormat.mChannelsPerFrame;
outputFormat.mBytesPerPacket = outputFormat.mBytesPerFrame * outputFormat.mFramesPerPacket;
outputChannelConfig = AudioChannelSideLeft | AudioChannelSideRight;
resetFilter = NO;
needPrefill = [hrtf needPrefill];
} else {
if(lastEnableHeadTracking) {
lastEnableHeadTracking = NO;
unregisterMotionListener();
}
referenceMatrixSet = NO;
hrtf = nil;
}
return YES;
}
- (void)fullShutdown {
hrtf = nil;
if(lastEnableHeadTracking) {
lastEnableHeadTracking = NO;
unregisterMotionListener();
}
resetFilter = NO;
}
- (BOOL)setup {
if(stopping)
return NO;
[self fullShutdown];
return [self fullInit];
}
- (void)cleanUp {
stopping = YES;
while(processEntered) {
usleep(500);
}
[self fullShutdown];
}
- (void)resetBuffer {
paused = YES;
while(processEntered) {
usleep(500);
}
[buffer reset];
[self fullShutdown];
paused = NO;
}
- (BOOL)paused {
return paused;
}
- (void)process {
while([self shouldContinue] == YES) {
if(paused || endOfStream) {
usleep(500);
continue;
}
@autoreleasepool {
AudioChunk *chunk = nil;
chunk = [self convert];
if(!chunk || ![chunk frameCount]) {
if([previousNode endOfStream] == YES) {
usleep(500);
endOfStream = YES;
continue;
}
if(paused) {
continue;
}
usleep(500);
} else {
[self writeChunk:chunk];
chunk = nil;
}
if(resetFilter || (!enableHrtf && hrtf)) {
[self fullShutdown];
}
}
}
}
- (AudioChunk *)convert {
if(stopping)
return nil;
processEntered = YES;
if(stopping || ([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES) || [self shouldContinue] == NO) {
processEntered = NO;
return nil;
}
if(![self peekFormat:&inputFormat channelConfig:&inputChannelConfig]) {
processEntered = NO;
return nil;
}
if(!inputFormat.mSampleRate ||
!inputFormat.mBitsPerChannel ||
!inputFormat.mChannelsPerFrame ||
!inputFormat.mBytesPerFrame ||
!inputFormat.mFramesPerPacket ||
!inputFormat.mBytesPerPacket) {
processEntered = NO;
return nil;
}
if((enableHrtf && !hrtf) ||
memcmp(&inputFormat, &lastInputFormat, sizeof(inputFormat)) != 0 ||
inputChannelConfig != lastInputChannelConfig) {
lastInputFormat = inputFormat;
lastInputChannelConfig = inputChannelConfig;
[self fullShutdown];
if(enableHrtf && ![self setup]) {
processEntered = NO;
return nil;
}
}
if(!hrtf) {
processEntered = NO;
return [self readChunk:4096];
}
AudioChunk *chunk = [self readChunkAsFloat32:4096];
if(!chunk || ![chunk frameCount]) {
processEntered = NO;
return nil;
}
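// A head-tracking update arrived: mirror the latest rotation, fold in the reference orientation, and reload the HRTF filter with the combined matrix.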
if(rotationMatrixUpdated) {
rotationMatrixUpdated = NO;
simd_float4x4 mirrorTransform = {
simd_make_float4(-1.0, 0.0, 0.0, 0.0),
simd_make_float4(0.0, 1.0, 0.0, 0.0),
simd_make_float4(0.0, 0.0, 1.0, 0.0),
simd_make_float4(0.0, 0.0, 0.0, 1.0)
};
simd_float4x4 matrix = simd_mul(mirrorTransform, rotationMatrix);
matrix = simd_mul(matrix, referenceMatrix);
[hrtf reloadWithMatrix:matrix];
}
double streamTimestamp = [chunk streamTimestamp];
size_t frameCount = [chunk frameCount];
NSData *sampleData = [chunk removeSamples:frameCount];
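// First block only: extrapolate backwards with LPC to synthesize the history the filter wants, then run it through to prime the convolver before the real samples.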
if(needPrefill) {
size_t maxToUse = 4096 - needPrefill;
if(maxToUse > frameCount) {
maxToUse = frameCount;
}
size_t channels = inputFormat.mChannelsPerFrame;
memcpy(&prefillBuffer[needPrefill * channels], [sampleData bytes], maxToUse * sizeof(float) * channels);
lpc_extrapolate_bkwd(&prefillBuffer[needPrefill * channels], maxToUse, maxToUse, (int)channels, LPC_ORDER, needPrefill, &extrapolate_buffer, &extrapolate_buffer_size);
[hrtf process:&prefillBuffer[0] sampleCount:(int)needPrefill toBuffer:&outBuffer[0]];
needPrefill = 0;
}
[hrtf process:(const float *)[sampleData bytes] sampleCount:(int)frameCount toBuffer:&outBuffer[0]];
AudioChunk *outputChunk = [[AudioChunk alloc] init];
[outputChunk setFormat:outputFormat];
if(outputChannelConfig) {
[outputChunk setChannelConfig:outputChannelConfig];
}
if([chunk isHDCD]) [outputChunk setHDCD];
[outputChunk setStreamTimestamp:streamTimestamp];
[outputChunk setStreamTimeRatio:[chunk streamTimeRatio]];
[outputChunk assignSamples:&outBuffer[0] frameCount:frameCount];
processEntered = NO;
return outputChunk;
}
- (void)reportMotion:(simd_float4x4)matrix {
rotationMatrix = matrix;
if(!referenceMatrixSet) {
referenceMatrix = simd_inverse(matrix);
referenceMatrixSet = YES;
}
rotationMatrixUpdated = YES;
}
- (void)resetReferencePosition:(NSNotification *)notification {
referenceMatrixSet = NO;
}
@end


@ -1,32 +0,0 @@
//
// DSPRubberbandNode.h
// CogAudio
//
// Created by Christopher Snowhill on 2/10/25.
//
#ifndef DSPRubberbandNode_h
#define DSPRubberbandNode_h
#import <CogAudio/DSPNode.h>
@interface DSPRubberbandNode : DSPNode {
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency;
- (BOOL)setup;
- (void)cleanUp;
- (void)resetBuffer;
- (BOOL)paused;
- (void)process;
- (AudioChunk * _Nullable)convert;
- (double)secondsBuffered;
@end
#endif /* DSPRubberbandNode_h */


@ -1,560 +0,0 @@
//
// DSPRubberbandNode.m
// CogAudio Framework
//
// Created by Christopher Snowhill on 2/10/25.
//
#import <Foundation/Foundation.h>
#import <Accelerate/Accelerate.h>
#import "DSPRubberbandNode.h"
#import "Logging.h"
#import <rubberband/rubberband-c.h>
static void * kDSPRubberbandNodeContext = &kDSPRubberbandNodeContext;
@implementation DSPRubberbandNode {
BOOL enableRubberband;
RubberBandState ts;
RubberBandOptions tslastoptions, tsnewoptions;
size_t tschannels;
ssize_t blockSize, toDrop, samplesBuffered;
BOOL tsapplynewoptions;
BOOL tsrestartengine;
double tempo, pitch;
double lastTempo, lastPitch;
double countIn;
uint64_t countOut;
double streamTimestamp;
double streamTimeRatio;
BOOL isHDCD;
BOOL stopping, paused;
BOOL processEntered;
BOOL flushed;
BOOL observersapplied;
AudioStreamBasicDescription lastInputFormat;
AudioStreamBasicDescription inputFormat;
uint32_t lastInputChannelConfig, inputChannelConfig;
float *rsPtrs[32];
float rsInBuffer[4096 * 32];
float rsOutBuffer[65536 * 32];
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency {
self = [super initWithController:c previous:p latency:latency];
if(self) {
NSUserDefaults *defaults = [[NSUserDefaultsController sharedUserDefaultsController] defaults];
enableRubberband = ![[defaults stringForKey:@"rubberbandEngine"] isEqualToString:@"disabled"];
pitch = [defaults doubleForKey:@"pitch"];
tempo = [defaults doubleForKey:@"tempo"];
lastPitch = pitch;
lastTempo = tempo;
[self addObservers];
}
return self;
}
- (void)dealloc {
DLog(@"Rubber Band dealloc");
[self setShouldContinue:NO];
[self cleanUp];
[self removeObservers];
[super cleanUp];
}
- (void)addObservers {
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.pitch" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.tempo" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.rubberbandEngine" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.rubberbandTransients" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.rubberbandDetector" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.rubberbandPhase" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.rubberbandWindow" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.rubberbandSmoothing" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.rubberbandFormant" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.rubberbandPitch" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.rubberbandChannels" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kDSPRubberbandNodeContext];
observersapplied = YES;
}
- (void)removeObservers {
if(observersapplied) {
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.pitch" context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.tempo" context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.rubberbandEngine" context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.rubberbandTransients" context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.rubberbandDetector" context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.rubberbandPhase" context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.rubberbandWindow" context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.rubberbandSmoothing" context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.rubberbandFormant" context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.rubberbandPitch" context:kDSPRubberbandNodeContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.rubberbandChannels" context:kDSPRubberbandNodeContext];
observersapplied = NO;
}
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
if(context != kDSPRubberbandNodeContext) {
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
return;
}
if([keyPath isEqualToString:@"values.pitch"] ||
[keyPath isEqualToString:@"values.tempo"]) {
NSUserDefaults *defaults = [[NSUserDefaultsController sharedUserDefaultsController] defaults];
pitch = [defaults doubleForKey:@"pitch"];
tempo = [defaults doubleForKey:@"tempo"];
tsapplynewoptions = YES;
} else if([[keyPath substringToIndex:17] isEqualToString:@"values.rubberband"]) {
NSUserDefaults *defaults = [[NSUserDefaultsController sharedUserDefaultsController] defaults];
enableRubberband = ![[defaults stringForKey:@"rubberbandEngine"] isEqualToString:@"disabled"];
if(enableRubberband && ts) {
RubberBandOptions options = [self getRubberbandOptions];
RubberBandOptions changed = options ^ tslastoptions;
if(changed) {
BOOL engineR3 = !!(options & RubberBandOptionEngineFiner);
// Options which require a restart of the engine
const RubberBandOptions mustRestart = RubberBandOptionEngineFaster | RubberBandOptionEngineFiner | RubberBandOptionWindowStandard | RubberBandOptionWindowShort | RubberBandOptionWindowLong | RubberBandOptionSmoothingOff | RubberBandOptionSmoothingOn | (engineR3 ? RubberBandOptionPitchHighSpeed | RubberBandOptionPitchHighQuality | RubberBandOptionPitchHighConsistency : 0) | RubberBandOptionChannelsApart | RubberBandOptionChannelsTogether;
if(changed & mustRestart) {
tsrestartengine = YES;
} else {
tsnewoptions = options;
tsapplynewoptions = YES;
}
}
}
}
}
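// Translate the user defaults into a RubberBand options bitmask; options specific to the older R2 engine are skipped when the finer (R3) engine is selected.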
- (RubberBandOptions)getRubberbandOptions {
RubberBandOptions options = RubberBandOptionProcessRealTime;
NSUserDefaults *defaults = [[NSUserDefaultsController sharedUserDefaultsController] defaults];
NSString *value = [defaults stringForKey:@"rubberbandEngine"];
BOOL engineR3 = NO;
if([value isEqualToString:@"faster"]) {
options |= RubberBandOptionEngineFaster;
} else if([value isEqualToString:@"finer"]) {
options |= RubberBandOptionEngineFiner;
engineR3 = YES;
}
if(!engineR3) {
value = [defaults stringForKey:@"rubberbandTransients"];
if([value isEqualToString:@"crisp"]) {
options |= RubberBandOptionTransientsCrisp;
} else if([value isEqualToString:@"mixed"]) {
options |= RubberBandOptionTransientsMixed;
} else if([value isEqualToString:@"smooth"]) {
options |= RubberBandOptionTransientsSmooth;
}
value = [defaults stringForKey:@"rubberbandDetector"];
if([value isEqualToString:@"compound"]) {
options |= RubberBandOptionDetectorCompound;
} else if([value isEqualToString:@"percussive"]) {
options |= RubberBandOptionDetectorPercussive;
} else if([value isEqualToString:@"soft"]) {
options |= RubberBandOptionDetectorSoft;
}
value = [defaults stringForKey:@"rubberbandPhase"];
if([value isEqualToString:@"laminar"]) {
options |= RubberBandOptionPhaseLaminar;
} else if([value isEqualToString:@"independent"]) {
options |= RubberBandOptionPhaseIndependent;
}
}
value = [defaults stringForKey:@"rubberbandWindow"];
if([value isEqualToString:@"standard"]) {
options |= RubberBandOptionWindowStandard;
} else if([value isEqualToString:@"short"]) {
options |= RubberBandOptionWindowShort;
} else if([value isEqualToString:@"long"]) {
if(engineR3) {
options |= RubberBandOptionWindowStandard;
} else {
options |= RubberBandOptionWindowLong;
}
}
if(!engineR3) {
value = [defaults stringForKey:@"rubberbandSmoothing"];
if([value isEqualToString:@"off"]) {
options |= RubberBandOptionSmoothingOff;
} else if([value isEqualToString:@"on"]) {
options |= RubberBandOptionSmoothingOn;
}
}
value = [defaults stringForKey:@"rubberbandFormant"];
if([value isEqualToString:@"shifted"]) {
options |= RubberBandOptionFormantShifted;
} else if([value isEqualToString:@"preserved"]) {
options |= RubberBandOptionFormantPreserved;
}
value = [defaults stringForKey:@"rubberbandPitch"];
if([value isEqualToString:@"highspeed"]) {
options |= RubberBandOptionPitchHighSpeed;
} else if([value isEqualToString:@"highquality"]) {
options |= RubberBandOptionPitchHighQuality;
} else if([value isEqualToString:@"highconsistency"]) {
options |= RubberBandOptionPitchHighConsistency;
}
value = [defaults stringForKey:@"rubberbandChannels"];
if([value isEqualToString:@"apart"]) {
options |= RubberBandOptionChannelsApart;
} else if([value isEqualToString:@"together"]) {
options |= RubberBandOptionChannelsTogether;
}
return options;
}
- (BOOL)fullInit {
RubberBandOptions options = [self getRubberbandOptions];
tslastoptions = options;
tschannels = inputFormat.mChannelsPerFrame;
ts = rubberband_new(inputFormat.mSampleRate, (int)tschannels, options, 1.0 / tempo, pitch);
if(!ts)
return NO;
blockSize = rubberband_get_process_size_limit(ts);
toDrop = rubberband_get_start_delay(ts);
samplesBuffered = 0;
if(blockSize > 4096)
blockSize = 4096;
rubberband_set_max_process_size(ts, (unsigned int)blockSize);
for(size_t i = 0; i < 32; ++i) {
rsPtrs[i] = &rsInBuffer[4096 * i];
}
ssize_t toPad = rubberband_get_preferred_start_pad(ts);
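// Feed the stretcher its preferred amount of leading silence so the first real samples are not consumed by its startup transient.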
if(toPad > 0) {
for(size_t i = 0; i < tschannels; ++i) {
memset(rsPtrs[i], 0, 4096 * sizeof(float));
}
while(toPad > 0) {
ssize_t p = toPad;
if(p > blockSize) p = blockSize;
rubberband_process(ts, (const float * const *)rsPtrs, (int)p, false);
toPad -= p;
}
}
tsapplynewoptions = NO;
tsrestartengine = NO;
flushed = NO;
countIn = 0.0;
countOut = 0;
return YES;
}
- (void)partialInit {
if(stopping || paused || !ts) return;
processEntered = YES;
RubberBandOptions changed = tslastoptions ^ tsnewoptions;
if(changed) {
tslastoptions = tsnewoptions;
BOOL engineR3 = !!(tsnewoptions & RubberBandOptionEngineFiner);
const RubberBandOptions transientsmask = RubberBandOptionTransientsCrisp | RubberBandOptionTransientsMixed | RubberBandOptionTransientsSmooth;
const RubberBandOptions detectormask = RubberBandOptionDetectorCompound | RubberBandOptionDetectorPercussive | RubberBandOptionDetectorSoft;
const RubberBandOptions phasemask = RubberBandOptionPhaseLaminar | RubberBandOptionPhaseIndependent;
const RubberBandOptions formantmask = RubberBandOptionFormantShifted | RubberBandOptionFormantPreserved;
const RubberBandOptions pitchmask = RubberBandOptionPitchHighSpeed | RubberBandOptionPitchHighQuality | RubberBandOptionPitchHighConsistency;
if(changed & transientsmask)
rubberband_set_transients_option(ts, tsnewoptions & transientsmask);
if(!engineR3) {
if(changed & detectormask)
rubberband_set_detector_option(ts, tsnewoptions & detectormask);
if(changed & phasemask)
rubberband_set_phase_option(ts, tsnewoptions & phasemask);
}
if(changed & formantmask)
rubberband_set_formant_option(ts, tsnewoptions & formantmask);
if(!engineR3 && (changed & pitchmask))
rubberband_set_pitch_option(ts, tsnewoptions & pitchmask);
}
if(fabs(pitch - lastPitch) > 1e-5 ||
fabs(tempo - lastTempo) > 1e-5) {
lastPitch = pitch;
lastTempo = tempo;
rubberband_set_pitch_scale(ts, pitch);
rubberband_set_time_ratio(ts, 1.0 / tempo);
}
tsapplynewoptions = NO;
processEntered = NO;
}
- (void)fullShutdown {
if(ts) {
rubberband_delete(ts);
ts = NULL;
}
}
- (BOOL)setup {
if(stopping)
return NO;
[self fullShutdown];
return [self fullInit];
}
- (void)cleanUp {
stopping = YES;
while(processEntered) {
usleep(500);
}
[self fullShutdown];
}
- (void)resetBuffer {
paused = YES;
while(processEntered) {
usleep(500);
}
[buffer reset];
[self fullShutdown];
paused = NO;
}
- (BOOL)paused {
return paused;
}
- (void)setPreviousNode:(id)p {
if(previousNode != p) {
paused = YES;
while(processEntered);
previousNode = p;
paused = NO;
}
}
- (void)setEndOfStream:(BOOL)e {
if(endOfStream && !e) {
[self fullShutdown];
}
[super setEndOfStream:e];
flushed = e;
}
- (void)process {
while([self shouldContinue] == YES) {
if(paused || endOfStream) {
usleep(500);
continue;
}
@autoreleasepool {
AudioChunk *chunk = nil;
chunk = [self convert];
if(!chunk || ![chunk frameCount]) {
if(!ts) {
flushed = previousNode && [[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES;
}
if(flushed) {
usleep(500);
endOfStream = YES;
continue;
}
if(paused) {
continue;
}
usleep(500);
} else {
[self writeChunk:chunk];
chunk = nil;
}
if(!enableRubberband && ts) {
[self fullShutdown];
} else if(tsrestartengine) {
[self fullShutdown];
} else if(tsapplynewoptions) {
[self partialInit];
}
}
}
}
- (AudioChunk *)convert {
if(stopping)
return nil;
processEntered = YES;
if(stopping || flushed || !previousNode || ([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES) || [self shouldContinue] == NO) {
processEntered = NO;
return nil;
}
if(![self peekFormat:&inputFormat channelConfig:&inputChannelConfig]) {
processEntered = NO;
return nil;
}
if(!inputFormat.mSampleRate ||
!inputFormat.mBitsPerChannel ||
!inputFormat.mChannelsPerFrame ||
!inputFormat.mBytesPerFrame ||
!inputFormat.mFramesPerPacket ||
!inputFormat.mBytesPerPacket) {
processEntered = NO;
return nil;
}
if((enableRubberband && !ts) ||
memcmp(&inputFormat, &lastInputFormat, sizeof(inputFormat)) != 0 ||
inputChannelConfig != lastInputChannelConfig) {
lastInputFormat = inputFormat;
lastInputChannelConfig = inputChannelConfig;
[self fullShutdown];
if(enableRubberband && ![self setup]) {
processEntered = NO;
return nil;
}
}
if(!ts) {
processEntered = NO;
return [self readChunk:4096];
}
ssize_t samplesToProcess = rubberband_get_samples_required(ts);
if(samplesToProcess > blockSize)
samplesToProcess = blockSize;
int channels = (int)(inputFormat.mChannelsPerFrame);
if(samplesToProcess > 0) {
AudioChunk *chunk = [self readAndMergeChunksAsFloat32:samplesToProcess];
if(!chunk || ![chunk frameCount]) {
processEntered = NO;
return nil;
}
streamTimestamp = [chunk streamTimestamp];
streamTimeRatio = [chunk streamTimeRatio];
isHDCD = [chunk isHDCD];
size_t frameCount = [chunk frameCount];
countIn += ((double)frameCount) / tempo;
NSData *sampleData = [chunk removeSamples:frameCount];
for (size_t i = 0; i < channels; ++i) {
cblas_scopy((int)frameCount, ((const float *)[sampleData bytes]) + i, channels, rsPtrs[i], 1);
}
flushed = [[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES;
int len = (int)frameCount;
rubberband_process(ts, (const float * const *)rsPtrs, len, flushed);
}
ssize_t samplesAvailable;
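// Drain the stretcher: discard its reported start delay first, then copy out as much as fits in the staging buffer, reinterleaving channel by channel.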
while(!stopping && (samplesAvailable = rubberband_available(ts)) > 0) {
if(toDrop > 0) {
ssize_t blockDrop = toDrop;
if(blockDrop > samplesAvailable) blockDrop = samplesAvailable;
if(blockDrop > blockSize) blockDrop = blockSize;
rubberband_retrieve(ts, (float * const *)rsPtrs, (int)blockDrop);
toDrop -= blockDrop;
continue;
}
ssize_t maxAvailable = 65536 - samplesBuffered;
ssize_t samplesOut = samplesAvailable;
if(samplesOut > maxAvailable) {
samplesOut = maxAvailable;
if(samplesOut <= 0) {
break;
}
}
if(samplesOut > blockSize) samplesOut = blockSize;
rubberband_retrieve(ts, (float * const *)rsPtrs, (int)samplesOut);
for(size_t i = 0; i < channels; ++i) {
cblas_scopy((int)samplesOut, rsPtrs[i], 1, &rsOutBuffer[samplesBuffered * channels + i], channels);
}
samplesBuffered += samplesOut;
}
if(flushed) {
if(samplesBuffered > 0) {
ssize_t ideal = (ssize_t)floor(countIn + 0.5);
if(countOut + samplesBuffered > ideal) {
// Rubber Band does not account for flushing duration in real time mode
samplesBuffered = ideal - countOut;
}
}
}
AudioChunk *outputChunk = nil;
if(samplesBuffered > 0) {
outputChunk = [[AudioChunk alloc] init];
[outputChunk setFormat:inputFormat];
if(inputChannelConfig) {
[outputChunk setChannelConfig:inputChannelConfig];
}
if(isHDCD) [outputChunk setHDCD];
[outputChunk setStreamTimestamp:streamTimestamp];
[outputChunk setStreamTimeRatio:streamTimeRatio * tempo];
[outputChunk assignSamples:&rsOutBuffer[0] frameCount:samplesBuffered];
countOut += samplesBuffered;
samplesBuffered = 0;
double chunkDuration = [outputChunk duration];
streamTimestamp += chunkDuration * [outputChunk streamTimeRatio];
}
processEntered = NO;
return outputChunk;
}
- (double)secondsBuffered {
double rbBuffered = 0.0;
if(ts) {
// We don't use Rubber Band's latency function, because at least in Cog's case
// it is effectively stale by the time this function is called, and also because
// it doesn't account for how much audio will be lopped off at the end of the process.
//
// Tested once, this tends to be close to zero when actually called.
rbBuffered = countIn - (double)(countOut);
if(rbBuffered < 0) {
rbBuffered = 0.0;
} else {
rbBuffered /= inputFormat.mSampleRate;
}
}
return [buffer listDuration] + rbBuffered;
}
@end


@ -1,36 +0,0 @@
//
// FSurroundFilter.h
// CogAudio
//
// Created by Christopher Snowhill on 7/9/22.
//
#ifndef FSurroundFilter_h
#define FSurroundFilter_h
#import <Cocoa/Cocoa.h>
#import <stdint.h>
#define FSurroundChunkSize 4096
@interface FSurroundFilter : NSObject {
void *decoder;
void *params;
double srate;
uint32_t channelCount;
uint32_t channelConfig;
float tempBuffer[4096 * 2];
}
- (id)initWithSampleRate:(double)srate;
- (uint32_t)channelCount;
- (uint32_t)channelConfig;
- (double)srate;
- (void)process:(const float *)samplesIn output:(float *)samplesOut count:(uint32_t)count;
@end
#endif /* FSurroundFilter_h */


@ -1,156 +0,0 @@
//
// FSurroundFilter.m
// CogAudio Framework
//
// Created by Christopher Snowhill on 7/9/22.
//
#import "FSurroundFilter.h"
#import "freesurround_decoder.h"
#import "AudioChunk.h"
#import <Accelerate/Accelerate.h>
#import <map>
#import <vector>
struct freesurround_params {
// the user-configurable parameters
float center_image, shift, depth, circular_wrap, focus, front_sep, rear_sep, bass_lo, bass_hi;
bool use_lfe;
channel_setup channels_fs; // FreeSurround channel setup
std::vector<unsigned> chanmap; // FreeSurround -> WFX channel index translation (derived data for faster lookup)
// construct with defaults
freesurround_params()
: center_image(0.7), shift(0), depth(1), circular_wrap(90), focus(0), front_sep(1), rear_sep(1),
bass_lo(40), bass_hi(90), use_lfe(false) {
set_channels_fs(cs_5point1);
}
// compute the WFX version of the channel setup code
unsigned channel_count() {
return (unsigned)chanmap.size();
}
unsigned channels_wfx() {
unsigned res = 0;
for(unsigned i = 0; i < chanmap.size(); res |= chanmap[i++]) {};
return res;
}
// assign a channel setup & recompute derived data
void set_channels_fs(channel_setup setup) {
channels_fs = setup;
chanmap.clear();
// Note: Because WFX does not define a few of the more exotic channels (side front left&right, side rear left&right, back center left&right),
// the side front/back channel pairs (both left and right sides, resp.) are mapped here onto foobar's top front/back channel pairs and the
// back (off-)center left/right channels are mapped onto foobar's top front center and top back center, respectively.
// Therefore, these speakers should be connected to those outputs instead.
std::map<channel_id, uint32_t> fs2wfx;
fs2wfx[ci_front_left] = AudioChannelFrontLeft;
fs2wfx[ci_front_center_left] = AudioChannelFrontCenterLeft;
fs2wfx[ci_front_center] = AudioChannelFrontCenter;
fs2wfx[ci_front_center_right] = AudioChannelFrontCenterRight;
fs2wfx[ci_front_right] = AudioChannelFrontRight;
fs2wfx[ci_side_front_left] = AudioChannelTopFrontLeft;
fs2wfx[ci_side_front_right] = AudioChannelTopFrontRight;
fs2wfx[ci_side_center_left] = AudioChannelSideLeft;
fs2wfx[ci_side_center_right] = AudioChannelSideRight;
fs2wfx[ci_side_back_left] = AudioChannelTopBackLeft;
fs2wfx[ci_side_back_right] = AudioChannelTopBackRight;
fs2wfx[ci_back_left] = AudioChannelBackLeft;
fs2wfx[ci_back_center_left] = AudioChannelTopFrontCenter;
fs2wfx[ci_back_center] = AudioChannelBackCenter;
fs2wfx[ci_back_center_right] = AudioChannelTopBackCenter;
fs2wfx[ci_back_right] = AudioChannelBackRight;
fs2wfx[ci_lfe] = AudioChannelLFE;
for(unsigned i = 0; i < freesurround_decoder::num_channels(channels_fs); i++)
chanmap.push_back(fs2wfx[freesurround_decoder::channel_at(channels_fs, i)]);
}
};
@implementation FSurroundFilter
- (id)initWithSampleRate:(double)srate {
self = [super init];
if(!self) return nil;
self->srate = srate;
freesurround_params *_params = new freesurround_params;
params = (void *)_params;
freesurround_decoder *_decoder = new freesurround_decoder(cs_5point1, 4096);
decoder = (void *)_decoder;
_decoder->circular_wrap(_params->circular_wrap);
_decoder->shift(_params->shift);
_decoder->depth(_params->depth);
_decoder->focus(_params->focus);
_decoder->center_image(_params->center_image);
_decoder->front_separation(_params->front_sep);
_decoder->rear_separation(_params->rear_sep);
_decoder->bass_redirection(_params->use_lfe);
_decoder->low_cutoff(_params->bass_lo / (srate / 2.0));
_decoder->high_cutoff(_params->bass_hi / (srate / 2.0));
channelCount = _params->channel_count();
channelConfig = _params->channels_wfx();
return self;
}
- (void)dealloc {
if(decoder) {
freesurround_decoder *_decoder = (freesurround_decoder *)decoder;
delete _decoder;
}
if(params) {
freesurround_params *_params = (freesurround_params *)params;
delete _params;
}
}
- (uint32_t)channelCount {
return channelCount;
}
- (uint32_t)channelConfig {
return channelConfig;
}
- (double)srate {
return srate;
}
- (void)process:(const float *)samplesIn output:(float *)samplesOut count:(uint32_t)count {
freesurround_params *_params = (freesurround_params *)params;
freesurround_decoder *_decoder = (freesurround_decoder *)decoder;
uint32_t zeroCount = 0;
if(count > 4096) {
zeroCount = count - 4096;
count = 4096;
}
if(count < 4096) {
cblas_scopy(count * 2, samplesIn, 1, &tempBuffer[0], 1);
vDSP_vclr(&tempBuffer[count * 2], 1, (4096 - count) * 2);
samplesIn = &tempBuffer[0];
}
float *src = _decoder->decode(samplesIn);
for(unsigned c = 0, num = channelCount; c < num; c++) {
unsigned idx = [AudioChunk channelIndexFromConfig:channelConfig forFlag:_params->chanmap[c]];
cblas_scopy(count, src + c, num, samplesOut + idx, num);
if(zeroCount) {
vDSP_vclr(samplesOut + idx + count, num, zeroCount);
}
}
}
@end

View file

@ -1,46 +0,0 @@
//
// HeadphoneFilter.h
// CogAudio Framework
//
// Created by Christopher Snowhill on 1/24/22.
//
#ifndef HeadphoneFilter_h
#define HeadphoneFilter_h
#import <Accelerate/Accelerate.h>
#import <Cocoa/Cocoa.h>
#import <simd/simd.h>
@interface HeadphoneFilter : NSObject {
NSURL *URL;
int bufferSize;
int paddedBufferSize;
double sampleRate;
int channelCount;
uint32_t config;
float **mirroredImpulseResponses;
float **prevInputs;
float *paddedSignal[2];
}
+ (BOOL)validateImpulseFile:(NSURL *)url;
- (id)initWithImpulseFile:(NSURL *)url forSampleRate:(double)sampleRate withInputChannels:(int)channels withConfig:(uint32_t)config withMatrix:(simd_float4x4)matrix;
- (void)reloadWithMatrix:(simd_float4x4)matrix;
- (void)process:(const float *)inBuffer sampleCount:(int)count toBuffer:(float *)outBuffer;
- (void)reset;
- (size_t)needPrefill;
@end
#endif /* HeadphoneFilter_h */
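A rough usage sketch built from the declarations above; the impulse-response path, the stereo input, and the 44100 Hz rate are assumptions for illustration:
NSURL *impulse = [NSURL fileURLWithPath:@"/tmp/hrtf-impulse.bin"]; // hypothetical path
if([HeadphoneFilter validateImpulseFile:impulse]) {
HeadphoneFilter *hf = [[HeadphoneFilter alloc] initWithImpulseFile:impulse forSampleRate:44100.0 withInputChannels:2 withConfig:AudioConfigStereo withMatrix:matrix_identity_float4x4];
float input[1024 * 2]; // interleaved stereo in
float output[1024 * 2]; // binaural stereo out
// ... fill input with 1024 frames ...
[hf process:input sampleCount:1024 toBuffer:output];
}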

View file

@ -1,386 +0,0 @@
//
// HeadphoneFilter.m
// CogAudio Framework
//
// Created by Christopher Snowhill on 1/24/22.
//
#import "HeadphoneFilter.h"
#import "AudioChunk.h"
#import "AudioDecoder.h"
#import "AudioSource.h"
#import <stdlib.h>
#import <fstream>
#import <soxr.h>
#import "HrtfData.h"
#import "Logging.h"
typedef struct speakerPosition {
float elevation;
float azimuth;
float distance;
} speakerPosition;
#define DEGREES(x) ((x)*M_PI / 180.0)
static const speakerPosition speakerPositions[18] = {
{ .elevation = DEGREES(0.0), .azimuth = DEGREES(-30.0), .distance = 1.0 },
{ .elevation = DEGREES(0.0), .azimuth = DEGREES(+30.0), .distance = 1.0 },
{ .elevation = DEGREES(0.0), .azimuth = DEGREES(0.0), .distance = 1.0 },
{ .elevation = DEGREES(0.0), .azimuth = DEGREES(0.0), .distance = 1.0 },
{ .elevation = DEGREES(0.0), .azimuth = DEGREES(-135.0), .distance = 1.0 },
{ .elevation = DEGREES(0.0), .azimuth = DEGREES(+135.0), .distance = 1.0 },
{ .elevation = DEGREES(0.0), .azimuth = DEGREES(-15.0), .distance = 1.0 },
{ .elevation = DEGREES(0.0), .azimuth = DEGREES(+15.0), .distance = 1.0 },
{ .elevation = DEGREES(0.0), .azimuth = DEGREES(-180.0), .distance = 1.0 },
{ .elevation = DEGREES(0.0), .azimuth = DEGREES(-90.0), .distance = 1.0 },
{ .elevation = DEGREES(0.0), .azimuth = DEGREES(+90.0), .distance = 1.0 },
{ .elevation = DEGREES(+90.0), .azimuth = DEGREES(0.0), .distance = 1.0 },
{ .elevation = DEGREES(+45.0), .azimuth = DEGREES(-30.0), .distance = 1.0 },
{ .elevation = DEGREES(+45.0), .azimuth = DEGREES(0.0), .distance = 1.0 },
{ .elevation = DEGREES(+45.0), .azimuth = DEGREES(+30.0), .distance = 1.0 },
{ .elevation = DEGREES(+45.0), .azimuth = DEGREES(-135.0), .distance = 1.0 },
{ .elevation = DEGREES(+45.0), .azimuth = DEGREES(180.0), .distance = 1.0 },
{ .elevation = DEGREES(+45.0), .azimuth = DEGREES(+135.0), .distance = 1.0 }
};
static simd_float4x4 matX(float theta) {
simd_float4x4 mat = {
simd_make_float4(1.0f, 0.0f, 0.0f, 0.0f),
simd_make_float4(0.0f, cosf(theta), -sinf(theta), 0.0f),
simd_make_float4(0.0f, sinf(theta), cosf(theta), 0.0f),
simd_make_float4(0.0f, 0.0f, 0.0f, 1.0f)
};
return mat;
};
static simd_float4x4 matY(float theta) {
simd_float4x4 mat = {
simd_make_float4(cosf(theta), 0.0f, sinf(theta), 0.0f),
simd_make_float4(0.0f, 1.0f, 0.0f, 0.0f),
simd_make_float4(-sinf(theta), 0.0f, cosf(theta), 0.0f),
simd_make_float4(0.0f, 0.0f, 0.0f, 1.0f)
};
return mat;
}
#if 0
static simd_float4x4 matZ(float theta) {
simd_float4x4 mat = {
simd_make_float4(cosf(theta), -sinf(theta), 0.0f, 0.0f),
simd_make_float4(sinf(theta), cosf(theta), 0.0f, 0.0f),
simd_make_float4(0.0f, 0.0f, 1.0f, 0.0f),
simd_make_float4(0.0f, 0.0f, 0.0f, 1.0f)
};
return mat;
};
#endif
static void transformPosition(float &elevation, float &azimuth, const simd_float4x4 &matrix) {
simd_float4x4 mat_x = matX(azimuth);
simd_float4x4 mat_y = matY(elevation);
//simd_float4x4 mat_z = matrix_identity_float4x4;
simd_float4x4 offsetMatrix = simd_mul(mat_x, mat_y);
//offsetMatrix = simd_mul(offsetMatrix, mat_z);
offsetMatrix = simd_mul(offsetMatrix, matrix);
double sy = sqrt(offsetMatrix.columns[0].x * offsetMatrix.columns[0].x + offsetMatrix.columns[1].x * offsetMatrix.columns[1].x);
bool singular = sy < 1e-6; // If sy is close to zero, the rotation is singular (gimbal lock)
float x, y/*, z*/;
if(!singular) {
x = atan2(offsetMatrix.columns[2].y, offsetMatrix.columns[2].z);
y = atan2(-offsetMatrix.columns[2].x, sy);
//z = atan2(offsetMatrix.columns[1].x, offsetMatrix.columns[0].x);
} else {
x = atan2(-offsetMatrix.columns[1].z, offsetMatrix.columns[1].y);
y = atan2(-offsetMatrix.columns[2].x, sy);
//z = 0;
}
elevation = y;
azimuth = x;
if(elevation < (M_PI * (-0.5))) {
elevation = (M_PI * (-0.5));
} else if(elevation > M_PI * 0.5) {
elevation = M_PI * 0.5;
}
while(azimuth < (M_PI * (-2.0))) {
azimuth += M_PI * 2.0;
}
while(azimuth > M_PI * 2.0) {
azimuth -= M_PI * 2.0;
}
}
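// A quick sanity check of the decomposition above, as a sketch: with the
// identity head matrix, a speaker's angles should come back unchanged.
//
//   float el = DEGREES(0.0), az = DEGREES(-30.0);
//   transformPosition(el, az, matrix_identity_float4x4);
//   // el stays ~0 and az stays ~-30 degrees (expressed in radians)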
@interface impulseSetCache : NSObject {
NSURL *URL;
HrtfData *data;
}
+ (impulseSetCache *)sharedController;
- (void)getImpulse:(NSURL *)url outImpulse:(float **)outImpulse outSampleCount:(int *)outSampleCount sampleRate:(double)sampleRate channelCount:(int)channelCount channelConfig:(uint32_t)channelConfig withMatrix:(simd_float4x4)matrix;
@end
@implementation impulseSetCache
static impulseSetCache *_sharedController = nil;
+ (impulseSetCache *)sharedController {
@synchronized(self) {
if(!_sharedController) {
_sharedController = [[impulseSetCache alloc] init];
}
}
return _sharedController;
}
- (id)init {
self = [super init];
if(self) {
data = NULL;
}
return self;
}
- (void)dealloc {
delete data;
}
- (void)getImpulse:(NSURL *)url outImpulse:(float **)outImpulse outSampleCount:(int *)outSampleCount sampleRate:(double)sampleRate channelCount:(int)channelCount channelConfig:(uint32_t)channelConfig withMatrix:(simd_float4x4)matrix {
double sampleRateOfSource = 0;
int sampleCount = 0;
if(!data || ![url isEqualTo:URL]) {
delete data;
data = NULL;
URL = url;
NSString *filePath = [url path];
try {
std::ifstream file([filePath UTF8String], std::fstream::binary);
if(!file.is_open()) {
throw std::logic_error("Cannot open file.");
}
data = new HrtfData(file);
file.close();
} catch(std::exception &e) {
ALog(@"Exception caught: %s", e.what());
}
}
try {
soxr_quality_spec_t q_spec = soxr_quality_spec(SOXR_HQ, 0);
soxr_io_spec_t io_spec = soxr_io_spec(SOXR_FLOAT32_I, SOXR_FLOAT32_I);
soxr_runtime_spec_t runtime_spec = soxr_runtime_spec(0);
bool resampling;
sampleRateOfSource = data->get_sample_rate();
resampling = !!(fabs(sampleRateOfSource - sampleRate) > 1e-6);
uint32_t sampleCountResampled;
uint32_t sampleCountExact = data->get_response_length();
sampleCount = sampleCountExact + ((data->get_longest_delay() + 2) >> 2);
uint32_t actualSampleCount = sampleCount;
if(resampling) {
sampleCountResampled = (uint32_t)(((double)sampleCountExact) * sampleRate / sampleRateOfSource);
actualSampleCount = (uint32_t)(((double)actualSampleCount) * sampleRate / sampleRateOfSource);
io_spec.scale = sampleRateOfSource / sampleRate;
}
actualSampleCount = (actualSampleCount + 15) & ~15;
*outImpulse = (float *)calloc(sizeof(float), actualSampleCount * channelCount * 2);
if(!*outImpulse) {
throw std::bad_alloc();
}
float *hrtfData = *outImpulse;
for(uint32_t i = 0; i < channelCount; ++i) {
uint32_t channelFlag = [AudioChunk extractChannelFlag:i fromConfig:channelConfig];
uint32_t channelNumber = [AudioChunk findChannelIndex:channelFlag];
if(channelNumber < 18) {
const speakerPosition &speaker = speakerPositions[channelNumber];
DirectionData hrtfLeft;
DirectionData hrtfRight;
float azimuth = speaker.azimuth;
float elevation = speaker.elevation;
transformPosition(elevation, azimuth, matrix);
data->get_direction_data(elevation, azimuth, speaker.distance, hrtfLeft, hrtfRight);
if(resampling) {
ssize_t leftDelay = (ssize_t)((double)(hrtfLeft.delay) * 0.25 * sampleRate / sampleRateOfSource);
ssize_t rightDelay = (ssize_t)((double)(hrtfRight.delay) * 0.25 * sampleRate / sampleRateOfSource);
soxr_oneshot(sampleRateOfSource, sampleRate, 1, &hrtfLeft.impulse_response[0], sampleCountExact, NULL, &hrtfData[leftDelay + actualSampleCount * i * 2], sampleCountResampled, NULL, &io_spec, &q_spec, &runtime_spec);
soxr_oneshot(sampleRateOfSource, sampleRate, 1, &hrtfRight.impulse_response[0], sampleCountExact, NULL, &hrtfData[rightDelay + actualSampleCount * (i * 2 + 1)], sampleCountResampled, NULL, &io_spec, &q_spec, &runtime_spec);
} else {
cblas_scopy(sampleCountExact, &hrtfLeft.impulse_response[0], 1, &hrtfData[((hrtfLeft.delay + 2) >> 2) + actualSampleCount * i * 2], 1);
cblas_scopy(sampleCountExact, &hrtfRight.impulse_response[0], 1, &hrtfData[((hrtfRight.delay + 2) >> 2) + actualSampleCount * (i * 2 + 1)], 1);
}
}
}
*outSampleCount = actualSampleCount;
} catch(std::exception &e) {
ALog(@"Exception caught: %s", e.what());
}
}
@end
@implementation HeadphoneFilter
+ (BOOL)validateImpulseFile:(NSURL *)url {
NSString *filePath = [url path];
try {
std::ifstream file([filePath UTF8String], std::fstream::binary);
if(!file.is_open()) {
throw std::logic_error("Cannot open file.");
}
HrtfData data(file);
file.close();
return YES;
} catch(std::exception &e) {
ALog(@"Exception thrown: %s", e.what());
return NO;
}
}
- (id)initWithImpulseFile:(NSURL *)url forSampleRate:(double)sampleRate withInputChannels:(int)channels withConfig:(uint32_t)config withMatrix:(simd_float4x4)matrix {
self = [super init];
if(self) {
URL = url;
self->sampleRate = sampleRate;
channelCount = channels;
self->config = config;
float *impulseBuffer = NULL;
int sampleCount = 0;
[[impulseSetCache sharedController] getImpulse:url outImpulse:&impulseBuffer outSampleCount:&sampleCount sampleRate:sampleRate channelCount:channels channelConfig:config withMatrix:matrix];
if(!impulseBuffer) {
return nil;
}
mirroredImpulseResponses = (float **)calloc(sizeof(float *), channelCount * 2);
if(!mirroredImpulseResponses) {
free(impulseBuffer);
return nil;
}
for(int i = 0; i < channelCount * 2; ++i) {
mirroredImpulseResponses[i] = &impulseBuffer[sampleCount * i];
vDSP_vrvrs(mirroredImpulseResponses[i], 1, sampleCount);
}
paddedBufferSize = sampleCount;
paddedSignal[0] = (float *)calloc(sizeof(float), paddedBufferSize * 2);
if(!paddedSignal[0]) {
return nil;
}
paddedSignal[1] = paddedSignal[0] + paddedBufferSize;
prevInputs = (float **)calloc(channels, sizeof(float *));
if(!prevInputs)
return nil;
prevInputs[0] = (float *)calloc(sizeof(float), sampleCount * channelCount);
if(!prevInputs[0])
return nil;
for(int i = 1; i < channels; ++i) {
prevInputs[i] = prevInputs[i - 1] + sampleCount;
}
}
return self;
}
- (void)dealloc {
if(paddedSignal[0]) {
free(paddedSignal[0]);
}
if(prevInputs) {
if(prevInputs[0]) {
free(prevInputs[0]);
}
free(prevInputs);
}
if(mirroredImpulseResponses) {
if(mirroredImpulseResponses[0]) {
free(mirroredImpulseResponses[0]);
}
free(mirroredImpulseResponses);
}
}
- (void)reloadWithMatrix:(simd_float4x4)matrix {
@synchronized (self) {
if(!mirroredImpulseResponses[0]) {
return;
}
free(mirroredImpulseResponses[0]);
float *impulseBuffer = NULL;
int sampleCount = 0;
[[impulseSetCache sharedController] getImpulse:URL outImpulse:&impulseBuffer outSampleCount:&sampleCount sampleRate:sampleRate channelCount:channelCount channelConfig:config withMatrix:matrix];
for(int i = 0; i < channelCount * 2; ++i) {
mirroredImpulseResponses[i] = &impulseBuffer[sampleCount * i];
vDSP_vrvrs(mirroredImpulseResponses[i], 1, sampleCount);
}
}
}
- (void)process:(const float *)inBuffer sampleCount:(int)count toBuffer:(float *)outBuffer {
@synchronized (self) {
int sampleCount = paddedBufferSize;
while(count > 0) {
float left = 0, right = 0;
for(int i = 0; i < channelCount; ++i) {
float thisleft, thisright;
vDSP_vmul(prevInputs[i], 1, mirroredImpulseResponses[i * 2], 1, paddedSignal[0], 1, sampleCount);
vDSP_vmul(prevInputs[i], 1, mirroredImpulseResponses[i * 2 + 1], 1, paddedSignal[1], 1, sampleCount);
vDSP_sve(paddedSignal[0], 1, &thisleft, sampleCount);
vDSP_sve(paddedSignal[1], 1, &thisright, sampleCount);
left += thisleft;
right += thisright;
memmove(prevInputs[i], prevInputs[i] + 1, sizeof(float) * (sampleCount - 1));
prevInputs[i][sampleCount - 1] = *inBuffer++;
}
outBuffer[0] = left;
outBuffer[1] = right;
outBuffer += 2;
--count;
}
}
}
- (void)reset {
for(int i = 0; i < channelCount; ++i) {
vDSP_vclr(prevInputs[i], 1, paddedBufferSize);
}
}
- (size_t)needPrefill {
return paddedBufferSize;
}
@end

View file

@ -1,26 +0,0 @@
//
// DSPNode.h
// CogAudio
//
// Created by Christopher Snowhill on 2/10/25.
//
#ifndef DSPNode_h
#define DSPNode_h
#import <CogAudio/Node.h>
@interface DSPNode : Node {
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency;
- (void)threadEntry:(id _Nullable)arg;
- (void)setShouldContinue:(BOOL)s;
- (double)secondsBuffered;
@end
#endif /* DSPNode_h */

View file

@ -1,76 +0,0 @@
//
// DSPNode.m
// CogAudio Framework
//
// Created by Christopher Snowhill on 2/10/25.
//
#import <Foundation/Foundation.h>
#import "DSPNode.h"
@implementation DSPNode {
BOOL threadTerminated;
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency {
self = [super init];
if(self) {
buffer = [[ChunkList alloc] initWithMaximumDuration:latency];
writeSemaphore = [[Semaphore alloc] init];
readSemaphore = [[Semaphore alloc] init];
accessLock = [[NSLock alloc] init];
initialBufferFilled = NO;
controller = c;
endOfStream = NO;
shouldContinue = YES;
nodeChannelConfig = 0;
nodeLossless = NO;
durationPrebuffer = latency * 0.25;
inWrite = NO;
inPeek = NO;
inRead = NO;
inMerge = NO;
[self setPreviousNode:p];
#ifdef LOG_CHAINS
[self initLogFiles];
#endif
}
return self;
}
// DSP threads buffer for low latency, and therefore should have high priority
- (void)threadEntry:(id _Nullable)arg {
@autoreleasepool {
NSThread *currentThread = [NSThread currentThread];
[currentThread setThreadPriority:0.75];
[currentThread setQualityOfService:NSQualityOfServiceUserInitiated];
threadTerminated = NO;
[self process];
threadTerminated = YES;
}
}
- (void)setShouldContinue:(BOOL)s {
BOOL currentShouldContinue = shouldContinue;
shouldContinue = s;
if(!currentShouldContinue && s && threadTerminated) {
[self launchThread];
}
}
- (double)secondsBuffered {
return [buffer listDuration];
}
@end

View file

@ -149,12 +149,10 @@ static void downmix_to_stereo(const float *inBuffer, int channels, uint32_t conf
static void downmix_to_mono(const float *inBuffer, int channels, uint32_t config, float *outBuffer, size_t count) {
float tempBuffer[count * 2];
if(channels > 2 || config != AudioConfigStereo) {
downmix_to_stereo(inBuffer, channels, config, tempBuffer, count);
inBuffer = tempBuffer;
channels = 2;
config = AudioConfigStereo;
}
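// cblas_scopy pulls the left channel (stride 2) of the interleaved stereo
// buffer into outBuffer, then vDSP_vadd adds the right channel (inBuffer + 1,
// stride 2) on top, leaving the per-frame L+R sum as the mono signal.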
cblas_scopy((int)count, inBuffer, 2, outBuffer, 1);
vDSP_vadd(outBuffer, 1, inBuffer + 1, 2, outBuffer, 1, count);
}
@ -283,11 +281,7 @@ static void *kDownmixProcessorContext = &kDownmixProcessorContext;
}
- (void)process:(const void *)inBuffer frameCount:(size_t)frames output:(void *)outBuffer {
if(inputFormat.mChannelsPerFrame == 2 && outConfig == AudioConfigStereo &&
inConfig == (AudioChannelSideLeft | AudioChannelSideRight)) {
// Workaround for HRTF output
memcpy(outBuffer, inBuffer, frames * outputFormat.mBytesPerPacket);
} else if(inputFormat.mChannelsPerFrame > 2 && outConfig == AudioConfigStereo) {
if(inputFormat.mChannelsPerFrame > 2 && outConfig == AudioConfigStereo) {
downmix_to_stereo((const float *)inBuffer, inputFormat.mChannelsPerFrame, inConfig, (float *)outBuffer, frames);
} else if(inputFormat.mChannelsPerFrame > 1 && outConfig == AudioConfigMono) {
downmix_to_mono((const float *)inBuffer, inputFormat.mChannelsPerFrame, inConfig, (float *)outBuffer, frames);

View file

@ -12,9 +12,9 @@
#import <AudioUnit/AudioUnit.h>
#import <CoreAudio/AudioHardware.h>
#import <CogAudio/AudioDecoder.h>
#import <CogAudio/Node.h>
#import <CogAudio/Plugin.h>
#import "AudioDecoder.h"
#import "Node.h"
#import "Plugin.h"
#define INPUT_NODE_SEEK
@ -33,22 +33,19 @@
Semaphore *exitAtTheEndOfTheStream;
}
@property(readonly) Semaphore * _Nonnull exitAtTheEndOfTheStream;
@property(readonly) BOOL threadExited;
@property(readonly) Semaphore *exitAtTheEndOfTheStream;
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p;
- (BOOL)openWithSource:(id<CogSource>_Nonnull)source;
- (BOOL)openWithDecoder:(id<CogDecoder>_Nonnull)d;
- (BOOL)openWithSource:(id<CogSource>)source;
- (BOOL)openWithDecoder:(id<CogDecoder>)d;
- (void)process;
- (NSDictionary *_Nonnull)properties;
- (NSDictionary *)properties;
- (void)seek:(long)frame;
- (void)registerObservers;
- (BOOL)setTrack:(NSURL *_Nonnull)track;
- (BOOL)setTrack:(NSURL *)track;
- (id<CogDecoder>_Nonnull)decoder;
- (id<CogDecoder>)decoder;
@end

View file

@ -21,14 +21,12 @@
static void *kInputNodeContext = &kInputNodeContext;
@synthesize threadExited;
@synthesize exitAtTheEndOfTheStream;
- (id)initWithController:(id)c previous:(id)p {
self = [super initWithController:c previous:p];
if(self) {
exitAtTheEndOfTheStream = [[Semaphore alloc] init];
threadExited = NO;
}
return self;
@ -144,6 +142,10 @@ static void *kInputNodeContext = &kInputNodeContext;
}
- (void)process {
int amountInBuffer = 0;
int bytesInBuffer = 0;
void *inputBuffer = malloc(CHUNK_SIZE * 8 * 18); // Maximum 18 channels, dunno what we'll receive
BOOL shouldClose = YES;
BOOL seekError = NO;
@ -159,19 +161,20 @@ static void *kInputNodeContext = &kInputNodeContext;
while([self shouldContinue] == YES && [self endOfStream] == NO) {
if(shouldSeek == YES) {
BufferChain *bufferChain = controller;
AudioPlayer *audioPlayer = [bufferChain controller];
OutputNode *outputNode = [audioPlayer output];
BufferChain *bufferChain = [[controller controller] bufferChain];
ConverterNode *converter = [bufferChain converter];
DLog(@"SEEKING! Resetting Buffer");
[outputNode resetBackwards];
amountInBuffer = 0;
// This resets the converter's buffer
[self resetBuffer];
[converter resetBuffer];
[converter inputFormatDidChange:[bufferChain inputFormat] inputConfig:[bufferChain inputConfig]];
DLog(@"Reset buffer!");
DLog(@"SEEKING!");
@autoreleasepool {
seekError = [decoder seek:seekFrame] < 0;
}
shouldSeek = NO;
DLog(@"Seeked! Resetting Buffer");
@ -182,22 +185,20 @@ static void *kInputNodeContext = &kInputNodeContext;
}
}
AudioChunk *chunk;
if(amountInBuffer < CHUNK_SIZE) {
int framesToRead = CHUNK_SIZE - amountInBuffer;
int framesRead;
@autoreleasepool {
chunk = [decoder readAudio];
framesRead = [decoder readAudio:((char *)inputBuffer) + bytesInBuffer frames:framesToRead];
}
if(chunk && [chunk frameCount]) {
@autoreleasepool {
[self writeChunk:chunk];
chunk = nil;
}
if(framesRead > 0 && !seekError) {
amountInBuffer += framesRead;
bytesInBuffer += framesRead * bytesPerFrame;
[self writeData:inputBuffer amount:bytesInBuffer];
amountInBuffer = 0;
bytesInBuffer = 0;
} else {
if(chunk) {
@autoreleasepool {
chunk = nil;
}
}
DLog(@"End of stream? %@", [self properties]);
endOfStream = YES;
@ -218,17 +219,20 @@ static void *kInputNodeContext = &kInputNodeContext;
endOfStream = NO;
shouldClose = NO;
continue;
} else {
}
else {
break;
}
}
}
}
if(shouldClose)
[decoder close];
free(inputBuffer);
[exitAtTheEndOfTheStream signal];
threadExited = YES;
DLog("Input node thread stopping");
}
@ -237,8 +241,7 @@ static void *kInputNodeContext = &kInputNodeContext;
seekFrame = frame;
shouldSeek = YES;
DLog(@"Should seek!");
[self resetBuffer];
[writeSemaphore signal];
[semaphore signal];
if(endOfStream) {
[exitAtTheEndOfTheStream signal];
@ -272,7 +275,6 @@ static void *kInputNodeContext = &kInputNodeContext;
- (void)dealloc {
DLog(@"Input Node dealloc");
[self removeObservers];
[super cleanUp];
}
- (NSDictionary *)properties {

View file

@ -6,8 +6,8 @@
// Copyright 2006 Vincent Spader. All rights reserved.
//
#import <CogAudio/ChunkList.h>
#import <CogAudio/CogSemaphore.h>
#import "ChunkList.h"
#import "Semaphore.h"
#import <Cocoa/Cocoa.h>
#import <os/workgroup.h>
@ -15,25 +15,17 @@
#define BUFFER_SIZE 1024 * 1024
#define CHUNK_SIZE 16 * 1024
//#define LOG_CHAINS 1
@interface Node : NSObject {
ChunkList *buffer;
Semaphore *writeSemaphore;
Semaphore *readSemaphore;
Semaphore *semaphore;
NSLock *accessLock;
NSRecursiveLock *accessLock;
id __weak previousNode;
id __weak controller;
BOOL shouldReset;
BOOL inWrite;
BOOL inPeek;
BOOL inRead;
BOOL inMerge;
BOOL shouldContinue;
BOOL endOfStream; // All data is now in buffer
BOOL initialBufferFilled;
@ -41,34 +33,13 @@
AudioStreamBasicDescription nodeFormat;
uint32_t nodeChannelConfig;
BOOL nodeLossless;
double durationPrebuffer;
#ifdef LOG_CHAINS
NSFileHandle *logFileOut;
NSFileHandle *logFileIn;
#endif
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p;
#ifdef LOG_CHAINS
- (void)initLogFiles;
#endif
- (void)cleanUp;
- (BOOL)paused;
- (void)writeData:(const void *_Nonnull)ptr amount:(size_t)a;
- (void)writeChunk:(AudioChunk *_Nonnull)chunk;
- (AudioChunk *_Nonnull)readChunk:(size_t)maxFrames;
- (AudioChunk *_Nonnull)readChunkAsFloat32:(size_t)maxFrames;
- (AudioChunk *_Nonnull)readAndMergeChunks:(size_t)maxFrames;
- (AudioChunk *_Nonnull)readAndMergeChunksAsFloat32:(size_t)maxFrames;
- (BOOL)peekFormat:(AudioStreamBasicDescription *_Nonnull)format channelConfig:(uint32_t *_Nonnull)config;
- (BOOL)peekTimestamp:(double *_Nonnull)timestamp timeRatio:(double *_Nonnull)timeRatio;
- (void)process; // Should be overridden by subclass
- (void)threadEntry:(id _Nullable)arg;
@ -77,7 +48,6 @@
- (void)setShouldReset:(BOOL)s;
- (BOOL)shouldReset;
- (void)resetBackwards;
- (void)setPreviousNode:(id _Nullable)p;
- (id _Nullable)previousNode;
@ -92,8 +62,7 @@
- (uint32_t)nodeChannelConfig;
- (BOOL)nodeLossless;
- (Semaphore *_Nonnull)writeSemaphore;
- (Semaphore *_Nonnull)readSemaphore;
- (Semaphore *_Nonnull)semaphore;
//-(void)resetBuffer;

View file

@ -11,47 +11,21 @@
#import "BufferChain.h"
#import "Logging.h"
#import "OutputCoreAudio.h"
#import "OutputAVFoundation.h"
#import <pthread.h>
#import <mach/mach_time.h>
#ifdef LOG_CHAINS
#import "NSFileHandle+CreateFile.h"
static NSLock * _Node_lock = nil;
static uint64_t _Node_serial;
#endif
@implementation Node
#ifdef LOG_CHAINS
+ (void)initialize {
@synchronized (_Node_lock) {
if(!_Node_lock) {
_Node_lock = [[NSLock alloc] init];
_Node_serial = 0;
}
}
}
- (void)initLogFiles {
[_Node_lock lock];
logFileOut = [NSFileHandle fileHandleForWritingAtPath:[NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%@_output_%08lld.raw", [self className], _Node_serial++]] createFile:YES];
logFileIn = [NSFileHandle fileHandleForWritingAtPath:[NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%@_input_%08lld.raw", [self className], _Node_serial++]] createFile:YES];
[_Node_lock unlock];
}
#endif
- (id)initWithController:(id)c previous:(id)p {
self = [super init];
if(self) {
buffer = [[ChunkList alloc] initWithMaximumDuration:10.0];
writeSemaphore = [[Semaphore alloc] init];
readSemaphore = [[Semaphore alloc] init];
buffer = [[ChunkList alloc] initWithMaximumDuration:3.0];
semaphore = [[Semaphore alloc] init];
accessLock = [[NSLock alloc] init];
accessLock = [[NSRecursiveLock alloc] init];
initialBufferFilled = NO;
@ -62,38 +36,12 @@ static uint64_t _Node_serial;
nodeChannelConfig = 0;
nodeLossless = NO;
durationPrebuffer = 2.0;
inWrite = NO;
inPeek = NO;
inRead = NO;
inMerge = NO;
[self setPreviousNode:p];
#ifdef LOG_CHAINS
[self initLogFiles];
#endif
}
return self;
}
- (void)dealloc {
[self cleanUp];
}
- (void)cleanUp {
[self setShouldContinue:NO];
while(inWrite || inPeek || inRead || inMerge) {
[writeSemaphore signal];
if(previousNode) {
[[previousNode readSemaphore] signal];
}
usleep(500);
}
}
- (AudioStreamBasicDescription)nodeFormat {
return nodeFormat;
}
@ -107,12 +55,6 @@ static uint64_t _Node_serial;
}
- (void)writeData:(const void *)ptr amount:(size_t)amount {
inWrite = YES;
if(!shouldContinue || [self paused]) {
inWrite = NO;
return;
}
[accessLock lock];
AudioChunk *chunk = [[AudioChunk alloc] init];
@ -123,16 +65,11 @@ static uint64_t _Node_serial;
[chunk setLossless:nodeLossless];
[chunk assignSamples:ptr frameCount:amount / nodeFormat.mBytesPerPacket];
#ifdef LOG_CHAINS
if(logFileOut) {
[logFileOut writeData:[NSData dataWithBytes:ptr length:amount]];
}
#endif
const double chunkDuration = [chunk duration];
double durationLeft = [buffer maxDuration] - [buffer listDuration];
double durationList = [buffer listDuration];
double durationLeft = [buffer maxDuration] - durationList;
if(shouldContinue == YES && durationList >= durationPrebuffer) {
while(shouldContinue == YES && chunkDuration > durationLeft) {
if(durationLeft < chunkDuration) {
if(initialBufferFilled == NO) {
initialBufferFilled = YES;
if([controller respondsToSelector:@selector(initialBufferFilled:)])
@ -140,87 +77,18 @@ static uint64_t _Node_serial;
}
}
while(shouldContinue == YES && ![self paused] && durationLeft < 0.0) {
if(durationLeft < 0.0 || shouldReset) {
if(durationLeft < chunkDuration || shouldReset) {
[accessLock unlock];
[writeSemaphore timedWait:2000];
[semaphore wait];
[accessLock lock];
}
durationLeft = [buffer maxDuration] - [buffer listDuration];
}
BOOL doSignal = NO;
if([chunk frameCount]) {
[buffer addChunk:chunk];
doSignal = YES;
}
[accessLock unlock];
if(doSignal) {
[readSemaphore signal];
}
inWrite = NO;
}
- (void)writeChunk:(AudioChunk *)chunk {
inWrite = YES;
if(!shouldContinue || [self paused]) {
inWrite = NO;
return;
}
[accessLock lock];
double durationList = [buffer listDuration];
double durationLeft = [buffer maxDuration] - durationList;
if(shouldContinue == YES && durationList >= durationPrebuffer) {
if(initialBufferFilled == NO) {
initialBufferFilled = YES;
if([controller respondsToSelector:@selector(initialBufferFilled:)])
[controller performSelector:@selector(initialBufferFilled:) withObject:self];
}
}
while(shouldContinue == YES && ![self paused] && durationLeft < 0.0) {
if(previousNode && [previousNode shouldContinue] == NO) {
shouldContinue = NO;
break;
}
if(durationLeft < 0.0 || shouldReset) {
[accessLock unlock];
[writeSemaphore timedWait:2000];
[accessLock lock];
}
durationLeft = [buffer maxDuration] - [buffer listDuration];
}
BOOL doSignal = NO;
if([chunk frameCount]) {
#ifdef LOG_CHAINS
if(logFileOut) {
AudioChunk *chunkCopy = [chunk copy];
size_t frameCount = [chunkCopy frameCount];
NSData *chunkData = [chunkCopy removeSamples:frameCount];
[logFileOut writeData:chunkData];
}
#endif
[buffer addChunk:chunk];
doSignal = YES;
}
[accessLock unlock];
if(doSignal) {
[readSemaphore signal];
}
inWrite = NO;
}
// Should be overridden by subclass.
@ -234,110 +102,21 @@ static uint64_t _Node_serial;
}
- (BOOL)peekFormat:(nonnull AudioStreamBasicDescription *)format channelConfig:(nonnull uint32_t *)config {
inPeek = YES;
if(!shouldContinue || [self paused]) {
inPeek = NO;
return NO;
}
[accessLock lock];
while(shouldContinue && ![self paused] &&
[[previousNode buffer] isEmpty] && [previousNode endOfStream] == NO) {
[accessLock unlock];
[writeSemaphore signal];
[[previousNode readSemaphore] timedWait:2000];
[accessLock lock];
}
if(!shouldContinue || [self paused]) {
[accessLock unlock];
inPeek = NO;
return NO;
}
if([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES) {
[accessLock unlock];
inPeek = NO;
return NO;
}
BOOL ret = [[previousNode buffer] peekFormat:format channelConfig:config];
[accessLock unlock];
inPeek = NO;
return ret;
}
- (BOOL)peekTimestamp:(double *_Nonnull)timestamp timeRatio:(double *_Nonnull)timeRatio {
inPeek = YES;
if(!shouldContinue || [self paused]) {
inPeek = NO;
return NO;
}
[accessLock lock];
while(shouldContinue && ![self paused] &&
[[previousNode buffer] isEmpty] && [previousNode endOfStream] == NO) {
[accessLock unlock];
[writeSemaphore signal];
[[previousNode readSemaphore] timedWait:2000];
[accessLock lock];
}
if(!shouldContinue || [self paused]) {
[accessLock unlock];
inPeek = NO;
return NO;
}
if([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES) {
[accessLock unlock];
inPeek = NO;
return NO;
}
BOOL ret = [[previousNode buffer] peekTimestamp:timestamp timeRatio:timeRatio];
[accessLock unlock];
inPeek = NO;
return ret;
}
- (AudioChunk *)readChunk:(size_t)maxFrames {
inRead = YES;
if(!shouldContinue || [self paused]) {
inRead = NO;
return [[AudioChunk alloc] init];
}
[accessLock lock];
while(shouldContinue && ![self paused] &&
[[previousNode buffer] isEmpty] && [previousNode endOfStream] == NO) {
[accessLock unlock];
[writeSemaphore signal];
[[previousNode readSemaphore] timedWait:2000];
[accessLock lock];
if([previousNode shouldReset] == YES) {
break;
}
}
if(!shouldContinue || [self paused]) {
[accessLock unlock];
inRead = NO;
return [[AudioChunk alloc] init];
}
if([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES) {
endOfStream = YES;
[accessLock unlock];
inRead = NO;
return [[AudioChunk alloc] init];
}
@ -349,7 +128,7 @@ static uint64_t _Node_serial;
shouldReset = YES;
[previousNode setShouldReset:NO];
[[previousNode writeSemaphore] signal];
[[previousNode semaphore] signal];
}
AudioChunk *ret;
@ -361,203 +140,9 @@ static uint64_t _Node_serial;
[accessLock unlock];
if([ret frameCount]) {
[[previousNode writeSemaphore] signal];
[[previousNode semaphore] signal];
}
#ifdef LOG_CHAINS
if(logFileIn) {
AudioChunk *chunkCopy = [ret copy];
size_t frameCount = [chunkCopy frameCount];
NSData *chunkData = [chunkCopy removeSamples:frameCount];
[logFileIn writeData:chunkData];
}
#endif
inRead = NO;
return ret;
}
- (AudioChunk *)readChunkAsFloat32:(size_t)maxFrames {
inRead = YES;
if(!shouldContinue || [self paused]) {
inRead = NO;
return [[AudioChunk alloc] init];
}
[accessLock lock];
while(shouldContinue && ![self paused] &&
[[previousNode buffer] isEmpty] && [previousNode endOfStream] == NO) {
[accessLock unlock];
[writeSemaphore signal];
[[previousNode readSemaphore] timedWait:2000];
[accessLock lock];
if([previousNode shouldReset] == YES) {
break;
}
}
if(!shouldContinue || [self paused]) {
[accessLock unlock];
inRead = NO;
return [[AudioChunk alloc] init];
}
if([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES) {
[accessLock unlock];
inRead = NO;
return [[AudioChunk alloc] init];
}
if([previousNode shouldReset] == YES) {
@autoreleasepool {
[buffer reset];
}
shouldReset = YES;
[previousNode setShouldReset:NO];
[[previousNode writeSemaphore] signal];
}
AudioChunk *ret;
@autoreleasepool {
ret = [[previousNode buffer] removeSamplesAsFloat32:maxFrames];
}
[accessLock unlock];
if([ret frameCount]) {
[[previousNode writeSemaphore] signal];
}
#ifdef LOG_CHAINS
if(logFileIn) {
AudioChunk *chunkCopy = [ret copy];
size_t frameCount = [chunkCopy frameCount];
NSData *chunkData = [chunkCopy removeSamples:frameCount];
[logFileIn writeData:chunkData];
}
#endif
inRead = NO;
return ret;
}
- (AudioChunk *)readAndMergeChunks:(size_t)maxFrames {
inMerge = YES;
if(!shouldContinue || [self paused]) {
inMerge = NO;
return [[AudioChunk alloc] init];
}
[accessLock lock];
if([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES) {
[accessLock unlock];
inMerge = NO;
return [[AudioChunk alloc] init];
}
AudioChunk *ret;
@autoreleasepool {
ret = [[previousNode buffer] removeAndMergeSamples:maxFrames callBlock:^BOOL{
if([previousNode shouldReset] == YES) {
@autoreleasepool {
[buffer reset];
}
shouldReset = YES;
[previousNode setShouldReset:NO];
}
[accessLock unlock];
[[previousNode writeSemaphore] signal];
[[previousNode readSemaphore] timedWait:2000];
[accessLock lock];
return !shouldContinue || [self paused] || ([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES);
}];
}
[accessLock unlock];
if([ret frameCount]) {
[[previousNode writeSemaphore] signal];
#ifdef LOG_CHAINS
if(logFileIn) {
AudioChunk *chunkCopy = [ret copy];
size_t frameCount = [chunkCopy frameCount];
NSData *chunkData = [chunkCopy removeSamples:frameCount];
[logFileIn writeData:chunkData];
}
#endif
}
inMerge = NO;
return ret;
}
- (AudioChunk *)readAndMergeChunksAsFloat32:(size_t)maxFrames {
inMerge = YES;
if(!shouldContinue || [self paused]) {
inMerge = NO;
return [[AudioChunk alloc] init];
}
[accessLock lock];
if([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES) {
[accessLock unlock];
inMerge = NO;
return [[AudioChunk alloc] init];
}
AudioChunk *ret;
@autoreleasepool {
ret = [[previousNode buffer] removeAndMergeSamplesAsFloat32:maxFrames callBlock:^BOOL{
if([previousNode shouldReset] == YES) {
@autoreleasepool {
[buffer reset];
}
shouldReset = YES;
[previousNode setShouldReset:NO];
}
[accessLock unlock];
[[previousNode writeSemaphore] signal];
[[previousNode readSemaphore] timedWait:2000];
[accessLock lock];
return !shouldContinue || [self paused] || ([[previousNode buffer] isEmpty] && [previousNode endOfStream] == YES);
}];
}
[accessLock unlock];
if([ret frameCount]) {
[[previousNode writeSemaphore] signal];
#ifdef LOG_CHAINS
if(logFileIn) {
AudioChunk *chunkCopy = [ret copy];
size_t frameCount = [chunkCopy frameCount];
NSData *chunkData = [chunkCopy removeSamples:frameCount];
[logFileIn writeData:chunkData];
}
#endif
}
inMerge = NO;
return ret;
}
@ -596,31 +181,8 @@ static uint64_t _Node_serial;
}
}
- (void)lockedResetBuffer {
@autoreleasepool {
[buffer reset];
}
}
- (void)unlockedResetBuffer {
@autoreleasepool {
[accessLock lock];
[buffer reset];
[accessLock unlock];
}
}
// Implementations should override
- (BOOL)paused {
return NO;
}
- (Semaphore *)writeSemaphore {
return writeSemaphore;
}
- (Semaphore *)readSemaphore {
return readSemaphore;
- (Semaphore *)semaphore {
return semaphore;
}
- (BOOL)endOfStream {
@ -643,23 +205,4 @@ static uint64_t _Node_serial;
return 0.0;
}
// Reset everything in the chain
- (void)resetBackwards {
[accessLock lock];
if(buffer) {
[self lockedResetBuffer];
[writeSemaphore signal];
[readSemaphore signal];
}
Node *node = previousNode;
while(node) {
[node unlockedResetBuffer];
[node setShouldReset:YES];
[[node writeSemaphore] signal];
[[node readSemaphore] signal];
node = [node previousNode];
}
[accessLock unlock];
}
@end

View file

@ -12,8 +12,8 @@
#import <AudioUnit/AudioUnit.h>
#import <CoreAudio/AudioHardware.h>
#import <CogAudio/Node.h>
#import <CogAudio/OutputCoreAudio.h>
#import "Node.h"
#import "OutputAVFoundation.h"
@interface OutputNode : Node {
AudioStreamBasicDescription format;
@ -21,39 +21,37 @@
double amountPlayed;
double amountPlayedInterval;
OutputCoreAudio *output;
OutputAVFoundation *output;
BOOL paused;
BOOL started;
BOOL intervalReported;
}
- (void)beginEqualizer:(AudioUnit)eq;
- (void)refreshEqualizer:(AudioUnit)eq;
- (void)endEqualizer:(AudioUnit)eq;
- (double)amountPlayed;
- (double)amountPlayedInterval;
- (void)incrementAmountPlayed:(double)seconds;
- (void)setAmountPlayed:(double)seconds;
- (void)resetAmountPlayed;
- (void)resetAmountPlayedInterval;
- (BOOL)selectNextBuffer;
- (void)endOfInputPlayed;
- (BOOL)endOfStream;
- (BOOL)chainQueueHasTracks;
- (double)secondsBuffered;
- (void)setup;
- (void)setupWithInterval:(BOOL)resumeInterval;
- (void)process;
- (void)close;
- (void)seek:(double)time;
- (void)fadeOut;
- (void)fadeOutBackground;
- (void)fadeIn;
- (double)latency;
- (AudioChunk *)readChunk:(size_t)amount;
@ -61,16 +59,10 @@
- (AudioStreamBasicDescription)format;
- (uint32_t)config;
- (AudioStreamBasicDescription)deviceFormat;
- (uint32_t)deviceChannelConfig;
- (double)volume;
- (void)setVolume:(double)v;
- (void)setShouldContinue:(BOOL)s;
- (void)setShouldPlayOutBuffer:(BOOL)s;
- (void)pause;
- (void)resume;
@ -80,12 +72,4 @@
- (void)restartPlaybackAtCurrentPosition;
- (double)latency;
- (double)getVisLatency;
- (double)getTotalLatency;
- (id)controller;
- (id)downmix;
@end

View file

@ -9,72 +9,23 @@
#import "OutputNode.h"
#import "AudioPlayer.h"
#import "BufferChain.h"
#import "OutputCoreAudio.h"
#import "DSPRubberbandNode.h"
#import "DSPFSurroundNode.h"
#import "DSPHRTFNode.h"
#import "DSPEqualizerNode.h"
#import "VisualizationNode.h"
#import "DSPDownmixNode.h"
#import "OutputAVFoundation.h"
#import "Logging.h"
@implementation OutputNode {
BOOL DSPsLaunched;
Node *previousInput;
DSPRubberbandNode *rubberbandNode;
DSPFSurroundNode *fsurroundNode;
DSPHRTFNode *hrtfNode;
DSPEqualizerNode *equalizerNode;
DSPDownmixNode *downmixNode;
VisualizationNode *visualizationNode;
}
@implementation OutputNode
- (void)setup {
[self setupWithInterval:NO];
}
- (void)setupWithInterval:(BOOL)resumeInterval {
if(!resumeInterval) {
amountPlayed = 0.0;
amountPlayedInterval = 0.0;
intervalReported = NO;
}
paused = YES;
started = NO;
intervalReported = NO;
output = [[OutputCoreAudio alloc] initWithController:self];
output = [[OutputAVFoundation alloc] initWithController:self];
[output setup];
if(!DSPsLaunched) {
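// Chain order, as constructed below: buffer chain output -> Rubber Band
// (tempo/pitch) -> FreeSurround -> equalizer -> HRTF -> downmix ->
// visualization -> this OutputNode.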
rubberbandNode = [[DSPRubberbandNode alloc] initWithController:self previous:nil latency:0.1];
if(!rubberbandNode) return;
fsurroundNode = [[DSPFSurroundNode alloc] initWithController:self previous:rubberbandNode latency:0.03];
if(!fsurroundNode) return;
equalizerNode = [[DSPEqualizerNode alloc] initWithController:self previous:fsurroundNode latency:0.03];
if(!equalizerNode) return;
hrtfNode = [[DSPHRTFNode alloc] initWithController:self previous:equalizerNode latency:0.03];
if(!hrtfNode) return;
downmixNode = [[DSPDownmixNode alloc] initWithController:self previous:hrtfNode latency:0.03];
if(!downmixNode) return;
// Approximately double the chunk size for Vis at 44100Hz
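// 8192.0 / 44100.0 ≈ 0.19 seconds of visualization buffering.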
visualizationNode = [[VisualizationNode alloc] initWithController:self previous:downmixNode latency:8192.0 / 44100.0];
if(!visualizationNode) return;
[self setPreviousNode:visualizationNode];
DSPsLaunched = YES;
[self launchDSPs];
previousInput = nil;
}
}
- (void)seek:(double)time {
@ -99,19 +50,6 @@
[output resume];
}
- (void)fadeOut {
[output fadeOut];
}
- (void)fadeOutBackground {
[output fadeOutBackground];
}
- (void)fadeIn {
[self reconnectInputAndReplumb];
[output fadeIn];
}
- (void)incrementAmountPlayed:(double)seconds {
amountPlayed += seconds;
amountPlayedInterval += seconds;
@ -121,15 +59,6 @@
}
}
- (void)setAmountPlayed:(double)seconds {
double delta = seconds - amountPlayed;
if(delta > 0.0 && delta < 5.0) {
[self incrementAmountPlayed:delta];
} else if(delta) {
amountPlayed = seconds;
}
}
- (void)resetAmountPlayed {
amountPlayed = 0;
}
@ -140,11 +69,7 @@
}
- (BOOL)selectNextBuffer {
BOOL ret = [controller selectNextBuffer];
if(!ret) {
[self reconnectInputAndReplumb];
}
return ret;
return [controller selectNextBuffer];
}
- (void)endOfInputPlayed {
@ -164,74 +89,17 @@
return [buffer listDuration];
}
- (NSArray *)DSPs {
if(DSPsLaunched) {
return @[rubberbandNode, fsurroundNode, equalizerNode, hrtfNode, downmixNode, visualizationNode];
} else {
return @[];
}
}
- (BOOL)reconnectInput {
Node *finalNode = nil;
if(rubberbandNode) {
finalNode = [[controller bufferChain] finalNode];
[rubberbandNode setPreviousNode:finalNode];
}
return !!finalNode;
}
- (void)reconnectInputAndReplumb {
Node *finalNode = nil;
if(rubberbandNode) {
finalNode = [[controller bufferChain] finalNode];
[rubberbandNode setPreviousNode:finalNode];
}
NSArray *DSPs = [self DSPs];
for (Node *node in DSPs) {
[node setEndOfStream:NO];
[node setShouldContinue:YES];
}
}
- (void)launchDSPs {
NSArray *DSPs = [self DSPs];
for (Node *node in DSPs) {
[node launchThread];
}
}
- (AudioChunk *)readChunk:(size_t)amount {
@autoreleasepool {
if([self reconnectInput]) {
[self setPreviousNode:[[controller bufferChain] finalNode]];
AudioChunk *ret = [super readChunk:amount];
if((!ret || ![ret frameCount]) && [previousNode endOfStream]) {
endOfStream = YES;
/* if (n == 0) {
DLog(@"Output Buffer dry!");
}
*/
return ret;
} else {
return [[AudioChunk alloc] init];
}
}
}
- (BOOL)peekFormat:(nonnull AudioStreamBasicDescription *)format channelConfig:(nonnull uint32_t *)config {
@autoreleasepool {
if([self reconnectInput]) {
BOOL ret = [super peekFormat:format channelConfig:config];
if(!ret && [previousNode endOfStream]) {
endOfStream = YES;
}
return ret;
} else {
return NO;
}
}
}
@ -251,59 +119,29 @@
return config;
}
- (AudioStreamBasicDescription)deviceFormat {
return [output deviceFormat];
}
- (uint32_t)deviceChannelConfig {
return [output deviceChannelConfig];
}
- (void)setFormat:(AudioStreamBasicDescription *)f channelConfig:(uint32_t)channelConfig {
if(!shouldContinue) return;
format = *f;
config = channelConfig;
// Calculate a ratio and add to double(seconds) instead, as format may change
// double oldSampleRatio = sampleRatio;
AudioPlayer *audioPlayer = controller;
BufferChain *bufferChain = [audioPlayer bufferChain];
BufferChain *bufferChain = [controller bufferChain];
if(bufferChain) {
ConverterNode *converter = [bufferChain converter];
AudioStreamBasicDescription outputFormat;
uint32_t outputChannelConfig;
BOOL formatChanged = NO;
if(converter) {
AudioStreamBasicDescription converterFormat = [converter nodeFormat];
if(memcmp(&converterFormat, &format, sizeof(converterFormat)) != 0) {
formatChanged = YES;
}
}
if(downmixNode && output && !formatChanged) {
outputFormat = [output deviceFormat];
outputChannelConfig = [output deviceChannelConfig];
AudioStreamBasicDescription currentOutputFormat = [downmixNode nodeFormat];
uint32_t currentOutputChannelConfig = [downmixNode nodeChannelConfig];
if(memcmp(&currentOutputFormat, &outputFormat, sizeof(currentOutputFormat)) != 0 ||
currentOutputChannelConfig != outputChannelConfig) {
formatChanged = YES;
}
}
if(formatChanged) {
InputNode *inputNode = [bufferChain inputNode];
if(converter) {
[converter setOutputFormat:format];
}
if(downmixNode && output) {
[downmixNode setOutputFormat:[output deviceFormat] withChannelConfig:[output deviceChannelConfig]];
}
if(inputNode) {
AudioStreamBasicDescription inputFormat = [inputNode nodeFormat];
if(converter) {
[converter inputFormatDidChange:inputFormat inputConfig:[inputNode nodeChannelConfig]];
}
[inputNode seek:(long)(amountPlayed * inputFormat.mSampleRate)];
}
// This clears the resampler buffer, but not the input buffer
// We also have to jump the play position ahead accounting for
// the data we are flushing
amountPlayed += [[converter buffer] listDuration];
AudioStreamBasicDescription inf = [bufferChain inputFormat];
uint32_t config = [bufferChain inputConfig];
format.mChannelsPerFrame = inf.mChannelsPerFrame;
format.mBytesPerFrame = ((inf.mBitsPerChannel + 7) / 8) * format.mChannelsPerFrame;
format.mBytesPerPacket = format.mBytesPerFrame * format.mFramesPerPacket;
channelConfig = config;
[converter inputFormatDidChange:[bufferChain inputFormat] inputConfig:[bufferChain inputConfig]];
}
}
}
@ -311,24 +149,6 @@
- (void)close {
[output stop];
output = nil;
if(DSPsLaunched) {
NSArray *DSPs = [self DSPs];
for(Node *node in DSPs) {
[node setShouldContinue:NO];
}
previousNode = nil;
visualizationNode = nil;
downmixNode = nil;
hrtfNode = nil;
fsurroundNode = nil;
rubberbandNode = nil;
previousInput = nil;
DSPsLaunched = NO;
}
}
- (double)volume {
return [output volume];
}
- (void)setVolume:(double)v {
@ -338,22 +158,26 @@
- (void)setShouldContinue:(BOOL)s {
[super setShouldContinue:s];
NSArray *DSPs = [self DSPs];
for(Node *node in DSPs) {
[node setShouldContinue:s];
}
// if (s == NO)
// [output stop];
}
- (void)setShouldPlayOutBuffer:(BOOL)s {
[output setShouldPlayOutBuffer:s];
}
- (BOOL)isPaused {
return paused;
}
- (void)beginEqualizer:(AudioUnit)eq {
[controller beginEqualizer:eq];
}
- (void)refreshEqualizer:(AudioUnit)eq {
[controller refreshEqualizer:eq];
}
- (void)endEqualizer:(AudioUnit)eq {
[controller endEqualizer:eq];
}
- (void)sustainHDCD {
[output sustainHDCD];
}
@ -363,28 +187,7 @@
}
- (double)latency {
double latency = 0.0;
NSArray *DSPs = [self DSPs];
for(Node *node in DSPs) {
latency += [node secondsBuffered];
}
return [output latency] + latency;
}
- (double)getVisLatency {
return [output latency] + [visualizationNode secondsBuffered];
}
- (double)getTotalLatency {
return [[controller bufferChain] secondsBuffered] + [self latency];
}
- (id)controller {
return controller;
}
- (id)downmix {
return downmixNode;
return [output latency];
}
@end

View file

@ -1,35 +0,0 @@
//
// VisualizationNode.h
// CogAudio
//
// Created by Christopher Snowhill on 2/12/25.
//
#ifndef VisualizationNode_h
#define VisualizationNode_h
#import <CogAudio/Node.h>
@interface VisualizationNode : Node {
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency;
- (void)threadEntry:(id _Nullable)arg;
- (BOOL)setup;
- (void)cleanUp;
- (BOOL)paused;
- (void)resetBuffer;
- (void)setShouldContinue:(BOOL)s;
- (void)process;
- (double)secondsBuffered;
@end
#endif /* VisualizationNode_h */

View file

@ -1,273 +0,0 @@
//
// VisualizationNode.m
// CogAudio Framework
//
// Created by Christopher Snowhill on 2/12/25.
//
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import <Accelerate/Accelerate.h>
#import "Downmix.h"
#import <CogAudio/VisualizationController.h>
#import "BufferChain.h"
#import "Logging.h"
#import "rsstate.h"
#import "VisualizationNode.h"
@implementation VisualizationNode {
void *rs;
double lastVisRate;
BOOL processEntered;
BOOL stopping;
BOOL paused;
BOOL threadTerminated;
AudioStreamBasicDescription inputFormat;
AudioStreamBasicDescription visFormat; // Mono format for vis
uint32_t inputChannelConfig;
uint32_t visChannelConfig;
size_t resamplerRemain;
DownmixProcessor *downmixer;
VisualizationController *visController;
float visAudio[512];
float resamplerInput[8192];
float visTemp[8192];
}
- (id _Nullable)initWithController:(id _Nonnull)c previous:(id _Nullable)p latency:(double)latency {
self = [super init];
if(self) {
buffer = [[ChunkList alloc] initWithMaximumDuration:latency];
writeSemaphore = [[Semaphore alloc] init];
readSemaphore = [[Semaphore alloc] init];
accessLock = [[NSLock alloc] init];
initialBufferFilled = NO;
controller = c;
endOfStream = NO;
shouldContinue = YES;
nodeChannelConfig = 0;
nodeLossless = NO;
durationPrebuffer = latency * 0.25;
visController = [VisualizationController sharedController];
inWrite = NO;
inPeek = NO;
inRead = NO;
inMerge = NO;
[self setPreviousNode:p];
}
return self;
}
- (void)dealloc {
DLog(@"Visualization node dealloc");
[self setShouldContinue:NO];
[self cleanUp];
[super cleanUp];
}
// Visualization thread should be fairly high priority, too
- (void)threadEntry:(id _Nullable)arg {
@autoreleasepool {
NSThread *currentThread = [NSThread currentThread];
[currentThread setThreadPriority:0.75];
[currentThread setQualityOfService:NSQualityOfServiceUserInitiated];
threadTerminated = NO;
[self process];
threadTerminated = YES;
}
}
- (void)resetBuffer {
paused = YES;
while(processEntered) {
usleep(500);
}
[buffer reset];
[self fullShutdown];
paused = NO;
}
- (double)secondsBuffered {
return [buffer listDuration];
}
- (void)setShouldContinue:(BOOL)s {
BOOL currentShouldContinue = shouldContinue;
shouldContinue = s;
if(!currentShouldContinue && s && threadTerminated) {
[self launchThread];
}
}
- (BOOL)setup {
if(fabs(inputFormat.mSampleRate - 44100.0) > 1e-6) {
rs = rsstate_new(1, inputFormat.mSampleRate, 44100.0);
if(!rs) {
return NO;
}
resamplerRemain = 0;
}
visFormat = inputFormat;
visFormat.mChannelsPerFrame = 1;
visFormat.mBytesPerFrame = sizeof(float);
visFormat.mBytesPerPacket = visFormat.mBytesPerFrame * visFormat.mFramesPerPacket;
visChannelConfig = AudioChannelFrontCenter;
downmixer = [[DownmixProcessor alloc] initWithInputFormat:inputFormat inputConfig:inputChannelConfig andOutputFormat:visFormat outputConfig:visChannelConfig];
if(!downmixer) {
return NO;
}
return YES;
}
- (void)cleanUp {
stopping = YES;
while(processEntered) {
usleep(500);
}
[self fullShutdown];
}
- (void)fullShutdown {
if(rs) {
rsstate_delete(rs);
rs = NULL;
}
downmixer = nil;
}
- (BOOL)paused {
return paused;
}
- (void)process {
while([self shouldContinue] == YES) {
if(paused || endOfStream) {
usleep(500);
continue;
}
@autoreleasepool {
AudioChunk *chunk = nil;
chunk = [self readAndMergeChunksAsFloat32:512];
if(!chunk || ![chunk frameCount]) {
if([previousNode endOfStream] == YES) {
usleep(500);
endOfStream = YES;
continue;
}
} else {
[self processVis:[chunk copy]];
[self writeChunk:chunk];
chunk = nil;
}
}
}
endOfStream = YES;
}
- (void)postVisPCM:(const float *)visTemp amount:(size_t)samples {
[visController postVisPCM:visTemp amount:(int)samples];
}
- (void)processVis:(AudioChunk *)chunk {
processEntered = YES;
if(paused) {
processEntered = NO;
return;
}
AudioStreamBasicDescription format = [chunk format];
uint32_t channelConfig = [chunk channelConfig];
[visController postSampleRate:44100.0];
if(!rs || !downmixer ||
memcmp(&format, &inputFormat, sizeof(format)) != 0 ||
channelConfig != inputChannelConfig) {
if(rs) {
while(!stopping) {
int samplesFlushed;
samplesFlushed = (int)rsstate_flush(rs, &visTemp[0], 8192);
if(samplesFlushed > 1) {
[self postVisPCM:visTemp amount:samplesFlushed];
} else {
break;
}
}
}
[self fullShutdown];
inputFormat = format;
inputChannelConfig = channelConfig;
if(![self setup]) {
processEntered = NO;
return;
}
}
size_t frameCount = [chunk frameCount];
NSData *sampleData = [chunk removeSamples:frameCount];
[downmixer process:[sampleData bytes] frameCount:frameCount output:&visAudio[0]];
if(rs) {
int samplesProcessed;
size_t totalDone = 0;
size_t inDone = 0;
size_t visFrameCount = frameCount;
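// Push the mono samples through the resampler in pieces: top up
// resamplerInput with fresh input, resample into visTemp at 44100 Hz,
// post whatever was produced, and carry any unconsumed input over to
// the next pass via resamplerRemain.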
do {
if(stopping) {
break;
}
int visTodo = (int)MIN(visFrameCount, 8192 - resamplerRemain);
if(visTodo) {
cblas_scopy(visTodo, &visAudio[0], 1, &resamplerInput[resamplerRemain], 1);
}
visTodo += resamplerRemain;
resamplerRemain = 0;
samplesProcessed = (int)rsstate_resample(rs, &resamplerInput[0], visTodo, &inDone, &visTemp[0], 8192);
resamplerRemain = (int)(visTodo - inDone);
if(resamplerRemain && inDone) {
memmove(&resamplerInput[0], &resamplerInput[inDone], resamplerRemain * sizeof(float));
}
if(samplesProcessed) {
[self postVisPCM:&visTemp[0] amount:samplesProcessed];
}
totalDone += inDone;
visFrameCount -= inDone;
} while(samplesProcessed && visFrameCount);
} else {
[self postVisPCM:&visAudio[0] amount:frameCount];
}
processEntered = NO;
}
@end

View file

@ -1 +0,0 @@
#import "ThirdParty/deadbeef/fft.h"

View file

@ -3,7 +3,7 @@
archiveVersion = 1;
classes = {
};
objectVersion = 54;
objectVersion = 46;
objects = {
/* Begin PBXBuildFile section */
@ -25,9 +25,11 @@
17D21CA80B8BE4BA00D1EBDE /* Node.m in Sources */ = {isa = PBXBuildFile; fileRef = 17D21C7D0B8BE4BA00D1EBDE /* Node.m */; };
17D21CA90B8BE4BA00D1EBDE /* OutputNode.h in Headers */ = {isa = PBXBuildFile; fileRef = 17D21C7E0B8BE4BA00D1EBDE /* OutputNode.h */; settings = {ATTRIBUTES = (Public, ); }; };
17D21CAA0B8BE4BA00D1EBDE /* OutputNode.m in Sources */ = {isa = PBXBuildFile; fileRef = 17D21C7F0B8BE4BA00D1EBDE /* OutputNode.m */; };
17D21CC50B8BE4BA00D1EBDE /* OutputAVFoundation.h in Headers */ = {isa = PBXBuildFile; fileRef = 17D21C9C0B8BE4BA00D1EBDE /* OutputAVFoundation.h */; settings = {ATTRIBUTES = (Public, ); }; };
17D21CC60B8BE4BA00D1EBDE /* OutputAVFoundation.m in Sources */ = {isa = PBXBuildFile; fileRef = 17D21C9D0B8BE4BA00D1EBDE /* OutputAVFoundation.m */; };
17D21CC70B8BE4BA00D1EBDE /* Status.h in Headers */ = {isa = PBXBuildFile; fileRef = 17D21C9E0B8BE4BA00D1EBDE /* Status.h */; settings = {ATTRIBUTES = (Public, ); }; };
17D21CF30B8BE5EF00D1EBDE /* CogSemaphore.h in Headers */ = {isa = PBXBuildFile; fileRef = 17D21CF10B8BE5EF00D1EBDE /* CogSemaphore.h */; settings = {ATTRIBUTES = (Public, ); }; };
17D21CF40B8BE5EF00D1EBDE /* CogSemaphore.m in Sources */ = {isa = PBXBuildFile; fileRef = 17D21CF20B8BE5EF00D1EBDE /* CogSemaphore.m */; };
17D21CF30B8BE5EF00D1EBDE /* Semaphore.h in Headers */ = {isa = PBXBuildFile; fileRef = 17D21CF10B8BE5EF00D1EBDE /* Semaphore.h */; settings = {ATTRIBUTES = (Public, ); }; };
17D21CF40B8BE5EF00D1EBDE /* Semaphore.m in Sources */ = {isa = PBXBuildFile; fileRef = 17D21CF20B8BE5EF00D1EBDE /* Semaphore.m */; };
17D21DAD0B8BE76800D1EBDE /* AudioToolbox.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = 17D21DA90B8BE76800D1EBDE /* AudioToolbox.framework */; };
17D21DAE0B8BE76800D1EBDE /* AudioUnit.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = 17D21DAA0B8BE76800D1EBDE /* AudioUnit.framework */; };
17D21DAF0B8BE76800D1EBDE /* CoreAudio.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = 17D21DAB0B8BE76800D1EBDE /* CoreAudio.framework */; };
@ -39,38 +41,51 @@
17F94DD50B8D0F7000A34E87 /* PluginController.h in Headers */ = {isa = PBXBuildFile; fileRef = 17F94DD30B8D0F7000A34E87 /* PluginController.h */; settings = {ATTRIBUTES = (Public, ); }; };
17F94DD60B8D0F7000A34E87 /* PluginController.mm in Sources */ = {isa = PBXBuildFile; fileRef = 17F94DD40B8D0F7000A34E87 /* PluginController.mm */; };
17F94DDD0B8D101100A34E87 /* Plugin.h in Headers */ = {isa = PBXBuildFile; fileRef = 17F94DDC0B8D101100A34E87 /* Plugin.h */; settings = {ATTRIBUTES = (Public, ); }; };
831A50142865A7FD0049CFE4 /* rsstate.hpp in Headers */ = {isa = PBXBuildFile; fileRef = 831A50132865A7FD0049CFE4 /* rsstate.hpp */; };
831A50162865A8800049CFE4 /* rsstate.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 831A50152865A8800049CFE4 /* rsstate.cpp */; };
831A50182865A8B30049CFE4 /* rsstate.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A50172865A8B30049CFE4 /* rsstate.h */; };
831A4FDC2865A7DC0049CFE4 /* CDSPProcessor.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4F9D2865A7DC0049CFE4 /* CDSPProcessor.h */; };
831A4FDD2865A7DC0049CFE4 /* CDSPRealFFT.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4F9E2865A7DC0049CFE4 /* CDSPRealFFT.h */; };
831A4FDE2865A7DC0049CFE4 /* pffft_double.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FA02865A7DC0049CFE4 /* pffft_double.h */; };
831A4FDF2865A7DC0049CFE4 /* pf_neon_double_from_avx.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FA22865A7DC0049CFE4 /* pf_neon_double_from_avx.h */; };
831A4FE02865A7DC0049CFE4 /* pf_double.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FA32865A7DC0049CFE4 /* pf_double.h */; };
831A4FE12865A7DC0049CFE4 /* pf_neon_double.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FA42865A7DC0049CFE4 /* pf_neon_double.h */; };
831A4FE22865A7DC0049CFE4 /* pf_sse2_double.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FA52865A7DC0049CFE4 /* pf_sse2_double.h */; };
831A4FE32865A7DC0049CFE4 /* pf_avx_double.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FA62865A7DC0049CFE4 /* pf_avx_double.h */; };
831A4FE42865A7DC0049CFE4 /* pf_scalar_double.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FA72865A7DC0049CFE4 /* pf_scalar_double.h */; };
831A4FE52865A7DC0049CFE4 /* pffft_priv_impl.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FA82865A7DC0049CFE4 /* pffft_priv_impl.h */; };
831A4FE62865A7DC0049CFE4 /* pffft_double.c in Sources */ = {isa = PBXBuildFile; fileRef = 831A4FA92865A7DC0049CFE4 /* pffft_double.c */; };
831A4FF22865A7DC0049CFE4 /* CDSPHBUpsampler.inc in Sources */ = {isa = PBXBuildFile; fileRef = 831A4FB72865A7DC0049CFE4 /* CDSPHBUpsampler.inc */; };
831A4FF32865A7DC0049CFE4 /* r8butil.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FB82865A7DC0049CFE4 /* r8butil.h */; };
831A4FF52865A7DC0049CFE4 /* r8bbase.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FBA2865A7DC0049CFE4 /* r8bbase.h */; };
831A4FFE2865A7DC0049CFE4 /* CDSPSincFilterGen.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FC42865A7DC0049CFE4 /* CDSPSincFilterGen.h */; };
831A50072865A7DC0049CFE4 /* LICENSE in Resources */ = {isa = PBXBuildFile; fileRef = 831A4FD02865A7DC0049CFE4 /* LICENSE */; };
831A50082865A7DC0049CFE4 /* CDSPResampler.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FD12865A7DC0049CFE4 /* CDSPResampler.h */; };
831A50092865A7DC0049CFE4 /* CDSPHBUpsampler.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FD22865A7DC0049CFE4 /* CDSPHBUpsampler.h */; };
831A500B2865A7DC0049CFE4 /* CDSPBlockConvolver.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FD42865A7DC0049CFE4 /* CDSPBlockConvolver.h */; };
831A500C2865A7DC0049CFE4 /* fft4g.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FD52865A7DC0049CFE4 /* fft4g.h */; };
831A500D2865A7DC0049CFE4 /* CDSPHBDownsampler.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FD62865A7DC0049CFE4 /* CDSPHBDownsampler.h */; };
831A500E2865A7DC0049CFE4 /* r8bconf.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FD72865A7DC0049CFE4 /* r8bconf.h */; };
831A500F2865A7DC0049CFE4 /* CDSPFracInterpolator.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FD82865A7DC0049CFE4 /* CDSPFracInterpolator.h */; };
831A50102865A7DC0049CFE4 /* CDSPFIRFilter.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FD92865A7DC0049CFE4 /* CDSPFIRFilter.h */; };
831A50112865A7DC0049CFE4 /* r8bbase.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 831A4FDA2865A7DC0049CFE4 /* r8bbase.cpp */; };
831A50122865A7DC0049CFE4 /* pffft.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A4FDB2865A7DC0049CFE4 /* pffft.h */; };
831A50142865A7FD0049CFE4 /* r8bstate.hpp in Headers */ = {isa = PBXBuildFile; fileRef = 831A50132865A7FD0049CFE4 /* r8bstate.hpp */; };
831A50162865A8800049CFE4 /* r8bstate.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 831A50152865A8800049CFE4 /* r8bstate.cpp */; };
831A50182865A8B30049CFE4 /* r8bstate.h in Headers */ = {isa = PBXBuildFile; fileRef = 831A50172865A8B30049CFE4 /* r8bstate.h */; };
8328995327CB511000D7F028 /* RedundantPlaylistDataStore.m in Sources */ = {isa = PBXBuildFile; fileRef = 8328995127CB510F00D7F028 /* RedundantPlaylistDataStore.m */; };
8328995427CB511000D7F028 /* RedundantPlaylistDataStore.h in Headers */ = {isa = PBXBuildFile; fileRef = 8328995227CB511000D7F028 /* RedundantPlaylistDataStore.h */; };
8328995727CB51B700D7F028 /* SHA256Digest.h in Headers */ = {isa = PBXBuildFile; fileRef = 8328995527CB51B700D7F028 /* SHA256Digest.h */; };
8328995827CB51B700D7F028 /* SHA256Digest.m in Sources */ = {isa = PBXBuildFile; fileRef = 8328995627CB51B700D7F028 /* SHA256Digest.m */; };
8328995A27CB51C900D7F028 /* Security.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = 8328995927CB51C900D7F028 /* Security.framework */; };
833442422D6EFA6700C51D38 /* VisualizationController.h in Headers */ = {isa = PBXBuildFile; fileRef = 833442402D6EFA6700C51D38 /* VisualizationController.h */; settings = {ATTRIBUTES = (Public, ); }; };
833442432D6EFA6700C51D38 /* VisualizationController.m in Sources */ = {isa = PBXBuildFile; fileRef = 833442412D6EFA6700C51D38 /* VisualizationController.m */; };
833738EA2D5EA52500278628 /* DSPDownmixNode.h in Headers */ = {isa = PBXBuildFile; fileRef = 833738E92D5EA52500278628 /* DSPDownmixNode.h */; settings = {ATTRIBUTES = (Public, ); }; };
833738EC2D5EA53500278628 /* DSPDownmixNode.m in Sources */ = {isa = PBXBuildFile; fileRef = 833738EB2D5EA53500278628 /* DSPDownmixNode.m */; };
833738EF2D5EA5B700278628 /* Downmix.m in Sources */ = {isa = PBXBuildFile; fileRef = 833738EE2D5EA5B700278628 /* Downmix.m */; };
833738F02D5EA5B700278628 /* Downmix.h in Headers */ = {isa = PBXBuildFile; fileRef = 833738ED2D5EA5B700278628 /* Downmix.h */; settings = {ATTRIBUTES = (Public, ); }; };
8347C7412796C58800FA8A7D /* NSFileHandle+CreateFile.h in Headers */ = {isa = PBXBuildFile; fileRef = 8347C73F2796C58800FA8A7D /* NSFileHandle+CreateFile.h */; };
8347C7422796C58800FA8A7D /* NSFileHandle+CreateFile.m in Sources */ = {isa = PBXBuildFile; fileRef = 8347C7402796C58800FA8A7D /* NSFileHandle+CreateFile.m */; };
834A41A9287A90AB00EB9D9B /* freesurround_decoder.h in Headers */ = {isa = PBXBuildFile; fileRef = 834A41A5287A90AB00EB9D9B /* freesurround_decoder.h */; };
834A41AA287A90AB00EB9D9B /* freesurround_decoder.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 834A41A6287A90AB00EB9D9B /* freesurround_decoder.cpp */; };
834A41AB287A90AB00EB9D9B /* channelmaps.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 834A41A7287A90AB00EB9D9B /* channelmaps.cpp */; };
834A41AC287A90AB00EB9D9B /* channelmaps.h in Headers */ = {isa = PBXBuildFile; fileRef = 834A41A8287A90AB00EB9D9B /* channelmaps.h */; };
834FD4EB27AF8F380063BC83 /* AudioChunk.h in Headers */ = {isa = PBXBuildFile; fileRef = 834FD4EA27AF8F380063BC83 /* AudioChunk.h */; settings = {ATTRIBUTES = (Public, ); }; };
834FD4EB27AF8F380063BC83 /* AudioChunk.h in Headers */ = {isa = PBXBuildFile; fileRef = 834FD4EA27AF8F380063BC83 /* AudioChunk.h */; };
834FD4ED27AF91220063BC83 /* AudioChunk.m in Sources */ = {isa = PBXBuildFile; fileRef = 834FD4EC27AF91220063BC83 /* AudioChunk.m */; };
834FD4F027AF93680063BC83 /* ChunkList.h in Headers */ = {isa = PBXBuildFile; fileRef = 834FD4EE27AF93680063BC83 /* ChunkList.h */; settings = {ATTRIBUTES = (Public, ); }; };
834FD4F027AF93680063BC83 /* ChunkList.h in Headers */ = {isa = PBXBuildFile; fileRef = 834FD4EE27AF93680063BC83 /* ChunkList.h */; };
834FD4F127AF93680063BC83 /* ChunkList.m in Sources */ = {isa = PBXBuildFile; fileRef = 834FD4EF27AF93680063BC83 /* ChunkList.m */; };
83504165286447DA006B32CC /* Downmix.h in Headers */ = {isa = PBXBuildFile; fileRef = 83504163286447DA006B32CC /* Downmix.h */; };
83504166286447DA006B32CC /* Downmix.m in Sources */ = {isa = PBXBuildFile; fileRef = 83504164286447DA006B32CC /* Downmix.m */; };
8350416D28646149006B32CC /* CoreMedia.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = 8350416C28646149006B32CC /* CoreMedia.framework */; };
835C88B1279811A500E28EAE /* hdcd_decode2.h in Headers */ = {isa = PBXBuildFile; fileRef = 835C88AF279811A500E28EAE /* hdcd_decode2.h */; };
835C88B2279811A500E28EAE /* hdcd_decode2.c in Sources */ = {isa = PBXBuildFile; fileRef = 835C88B0279811A500E28EAE /* hdcd_decode2.c */; };
835DD2672ACAF1D90057E319 /* OutputCoreAudio.m in Sources */ = {isa = PBXBuildFile; fileRef = 835DD2652ACAF1D90057E319 /* OutputCoreAudio.m */; };
835DD2682ACAF1D90057E319 /* OutputCoreAudio.h in Headers */ = {isa = PBXBuildFile; fileRef = 835DD2662ACAF1D90057E319 /* OutputCoreAudio.h */; settings = {ATTRIBUTES = (Public, ); }; };
835DD2722ACAF5AD0057E319 /* lpc.h in Headers */ = {isa = PBXBuildFile; fileRef = 835DD26D2ACAF5AD0057E319 /* lpc.h */; };
835DD2732ACAF5AD0057E319 /* util.h in Headers */ = {isa = PBXBuildFile; fileRef = 835DD26E2ACAF5AD0057E319 /* util.h */; };
835DD2742ACAF5AD0057E319 /* lpc.c in Sources */ = {isa = PBXBuildFile; fileRef = 835DD26F2ACAF5AD0057E319 /* lpc.c */; };
835FAC5E27BCA14D00BA8562 /* BadSampleCleaner.h in Headers */ = {isa = PBXBuildFile; fileRef = 835FAC5C27BCA14D00BA8562 /* BadSampleCleaner.h */; };
835FAC5F27BCA14D00BA8562 /* BadSampleCleaner.m in Sources */ = {isa = PBXBuildFile; fileRef = 835FAC5D27BCA14D00BA8562 /* BadSampleCleaner.m */; };
83725A9027AA16C90003F694 /* Accelerate.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = 83725A7B27AA0D8A0003F694 /* Accelerate.framework */; };
@ -78,47 +93,31 @@
8377C64C27B8C51500E8BC0F /* fft_accelerate.c in Sources */ = {isa = PBXBuildFile; fileRef = 8377C64B27B8C51500E8BC0F /* fft_accelerate.c */; };
8377C64E27B8C54400E8BC0F /* fft.h in Headers */ = {isa = PBXBuildFile; fileRef = 8377C64D27B8C54400E8BC0F /* fft.h */; };
8384912718080FF100E7332D /* Logging.h in Headers */ = {isa = PBXBuildFile; fileRef = 8384912618080FF100E7332D /* Logging.h */; };
838A33722D06A97D00D0D770 /* librubberband.3.dylib in Frameworks */ = {isa = PBXBuildFile; fileRef = 838A33712D06A97D00D0D770 /* librubberband.3.dylib */; };
839065F32853338700636FBB /* dsd2float.h in Headers */ = {isa = PBXBuildFile; fileRef = 839065F22853338700636FBB /* dsd2float.h */; };
839366671815923C006DD712 /* CogPluginMulti.h in Headers */ = {isa = PBXBuildFile; fileRef = 839366651815923C006DD712 /* CogPluginMulti.h */; };
839366681815923C006DD712 /* CogPluginMulti.m in Sources */ = {isa = PBXBuildFile; fileRef = 839366661815923C006DD712 /* CogPluginMulti.m */; };
8399CF2C27B5D1D5008751F1 /* NSDictionary+Merge.h in Headers */ = {isa = PBXBuildFile; fileRef = 8399CF2A27B5D1D4008751F1 /* NSDictionary+Merge.h */; };
8399CF2D27B5D1D5008751F1 /* NSDictionary+Merge.m in Sources */ = {isa = PBXBuildFile; fileRef = 8399CF2B27B5D1D4008751F1 /* NSDictionary+Merge.m */; };
839E56E52879450300DFB5F4 /* HrtfData.h in Headers */ = {isa = PBXBuildFile; fileRef = 839E56E12879450300DFB5F4 /* HrtfData.h */; };
839E56E62879450300DFB5F4 /* Endianness.h in Headers */ = {isa = PBXBuildFile; fileRef = 839E56E22879450300DFB5F4 /* Endianness.h */; };
839E56E72879450300DFB5F4 /* HrtfData.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 839E56E32879450300DFB5F4 /* HrtfData.cpp */; };
839E56E82879450300DFB5F4 /* IHrtfData.h in Headers */ = {isa = PBXBuildFile; fileRef = 839E56E42879450300DFB5F4 /* IHrtfData.h */; };
839E56EA28794F6300DFB5F4 /* HrtfTypes.h in Headers */ = {isa = PBXBuildFile; fileRef = 839E56E928794F6300DFB5F4 /* HrtfTypes.h */; };
839E56F7287974A100DFB5F4 /* SandboxBroker.h in Headers */ = {isa = PBXBuildFile; fileRef = 839E56F6287974A100DFB5F4 /* SandboxBroker.h */; };
839E899E2D5DB9D500A13526 /* VisualizationNode.h in Headers */ = {isa = PBXBuildFile; fileRef = 839E899D2D5DB9D500A13526 /* VisualizationNode.h */; settings = {ATTRIBUTES = (Public, ); }; };
839E89A02D5DBA1700A13526 /* VisualizationNode.m in Sources */ = {isa = PBXBuildFile; fileRef = 839E899F2D5DBA1700A13526 /* VisualizationNode.m */; };
83A3496A2D5C3F430096D530 /* DSPRubberbandNode.m in Sources */ = {isa = PBXBuildFile; fileRef = 83A349682D5C3F430096D530 /* DSPRubberbandNode.m */; };
83A3496B2D5C3F430096D530 /* DSPRubberbandNode.h in Headers */ = {isa = PBXBuildFile; fileRef = 83A349672D5C3F430096D530 /* DSPRubberbandNode.h */; settings = {ATTRIBUTES = (Public, ); }; };
83A3496D2D5C40490096D530 /* DSPFSurroundNode.h in Headers */ = {isa = PBXBuildFile; fileRef = 83A3496C2D5C40490096D530 /* DSPFSurroundNode.h */; settings = {ATTRIBUTES = (Public, ); }; };
83A3496F2D5C405E0096D530 /* DSPFSurroundNode.m in Sources */ = {isa = PBXBuildFile; fileRef = 83A3496E2D5C405E0096D530 /* DSPFSurroundNode.m */; };
83A349722D5C41810096D530 /* FSurroundFilter.mm in Sources */ = {isa = PBXBuildFile; fileRef = 83A349712D5C41810096D530 /* FSurroundFilter.mm */; };
83A349732D5C41810096D530 /* FSurroundFilter.h in Headers */ = {isa = PBXBuildFile; fileRef = 83A349702D5C41810096D530 /* FSurroundFilter.h */; settings = {ATTRIBUTES = (Public, ); }; };
83A349752D5C50A10096D530 /* DSPHRTFNode.h in Headers */ = {isa = PBXBuildFile; fileRef = 83A349742D5C50A10096D530 /* DSPHRTFNode.h */; settings = {ATTRIBUTES = (Public, ); }; };
83A349772D5C50B20096D530 /* DSPHRTFNode.m in Sources */ = {isa = PBXBuildFile; fileRef = 83A349762D5C50B20096D530 /* DSPHRTFNode.m */; };
83B74281289E027F005AAC28 /* CogAudio-Bridging-Header.h in Headers */ = {isa = PBXBuildFile; fileRef = 83B74280289E027F005AAC28 /* CogAudio-Bridging-Header.h */; };
83F843202D5C6272008C123B /* HeadphoneFilter.h in Headers */ = {isa = PBXBuildFile; fileRef = 83F8431E2D5C6272008C123B /* HeadphoneFilter.h */; settings = {ATTRIBUTES = (Public, ); }; };
83F843212D5C6272008C123B /* HeadphoneFilter.mm in Sources */ = {isa = PBXBuildFile; fileRef = 83F8431F2D5C6272008C123B /* HeadphoneFilter.mm */; };
83F843232D5C66DA008C123B /* DSPEqualizerNode.h in Headers */ = {isa = PBXBuildFile; fileRef = 83F843222D5C66DA008C123B /* DSPEqualizerNode.h */; settings = {ATTRIBUTES = (Public, ); }; };
83F843252D5C66E9008C123B /* DSPEqualizerNode.m in Sources */ = {isa = PBXBuildFile; fileRef = 83F843242D5C66E9008C123B /* DSPEqualizerNode.m */; };
83F9FFF62D6EC43900026576 /* soxr.h in Headers */ = {isa = PBXBuildFile; fileRef = 83F9FFF02D6EC43900026576 /* soxr.h */; settings = {ATTRIBUTES = (Public, ); }; };
83F9FFF82D6EC43900026576 /* libsoxr.0.dylib in Frameworks */ = {isa = PBXBuildFile; fileRef = 83F9FFF22D6EC43900026576 /* libsoxr.0.dylib */; };
83FFED512D5B08BC0044CCAF /* DSPNode.h in Headers */ = {isa = PBXBuildFile; fileRef = 83FFED502D5B08BC0044CCAF /* DSPNode.h */; settings = {ATTRIBUTES = (Public, ); }; };
83FFED532D5B09320044CCAF /* DSPNode.m in Sources */ = {isa = PBXBuildFile; fileRef = 83FFED522D5B09320044CCAF /* DSPNode.m */; };
839B83FA286D91ED00F529EE /* VisualizationController.swift in Sources */ = {isa = PBXBuildFile; fileRef = 839B83F9286D91ED00F529EE /* VisualizationController.swift */; };
8DC2EF570486A6940098B216 /* Cocoa.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = 1058C7B1FEA5585E11CA2CBB /* Cocoa.framework */; };
8E8D3D2F0CBAEE6E00135C1B /* AudioContainer.h in Headers */ = {isa = PBXBuildFile; fileRef = 8E8D3D2D0CBAEE6E00135C1B /* AudioContainer.h */; settings = {ATTRIBUTES = (Public, ); }; };
8E8D3D300CBAEE6E00135C1B /* AudioContainer.m in Sources */ = {isa = PBXBuildFile; fileRef = 8E8D3D2E0CBAEE6E00135C1B /* AudioContainer.m */; };
8EC1225F0B993BD500C5B3AD /* ConverterNode.h in Headers */ = {isa = PBXBuildFile; fileRef = 8EC1225D0B993BD500C5B3AD /* ConverterNode.h */; settings = {ATTRIBUTES = (Public, ); }; };
8EC1225F0B993BD500C5B3AD /* ConverterNode.h in Headers */ = {isa = PBXBuildFile; fileRef = 8EC1225D0B993BD500C5B3AD /* ConverterNode.h */; };
8EC122600B993BD500C5B3AD /* ConverterNode.m in Sources */ = {isa = PBXBuildFile; fileRef = 8EC1225E0B993BD500C5B3AD /* ConverterNode.m */; };
B0575F2D0D687A0800411D77 /* Helper.h in Headers */ = {isa = PBXBuildFile; fileRef = B0575F2C0D687A0800411D77 /* Helper.h */; settings = {ATTRIBUTES = (Public, ); }; };
B0575F300D687A4000411D77 /* Helper.m in Sources */ = {isa = PBXBuildFile; fileRef = B0575F2F0D687A4000411D77 /* Helper.m */; };
/* End PBXBuildFile section */
/* Begin PBXCopyFilesBuildPhase section */
17D21D2B0B8BE6A200D1EBDE /* CopyFiles */ = {
isa = PBXCopyFilesBuildPhase;
buildActionMask = 2147483647;
dstPath = "";
dstSubfolderSpec = 10;
files = (
);
runOnlyForDeploymentPostprocessing = 0;
};
83725A8D27AA0DDB0003F694 /* CopyFiles */ = {
isa = PBXCopyFilesBuildPhase;
buildActionMask = 2147483647;
@ -152,9 +151,11 @@
17D21C7D0B8BE4BA00D1EBDE /* Node.m */ = {isa = PBXFileReference; fileEncoding = 30; lastKnownFileType = sourcecode.c.objc; path = Node.m; sourceTree = "<group>"; };
17D21C7E0B8BE4BA00D1EBDE /* OutputNode.h */ = {isa = PBXFileReference; fileEncoding = 30; lastKnownFileType = sourcecode.c.h; path = OutputNode.h; sourceTree = "<group>"; };
17D21C7F0B8BE4BA00D1EBDE /* OutputNode.m */ = {isa = PBXFileReference; fileEncoding = 30; lastKnownFileType = sourcecode.c.objc; path = OutputNode.m; sourceTree = "<group>"; };
17D21C9C0B8BE4BA00D1EBDE /* OutputAVFoundation.h */ = {isa = PBXFileReference; fileEncoding = 30; lastKnownFileType = sourcecode.c.h; path = OutputAVFoundation.h; sourceTree = "<group>"; };
17D21C9D0B8BE4BA00D1EBDE /* OutputAVFoundation.m */ = {isa = PBXFileReference; fileEncoding = 30; lastKnownFileType = sourcecode.c.objc; path = OutputAVFoundation.m; sourceTree = "<group>"; };
17D21C9E0B8BE4BA00D1EBDE /* Status.h */ = {isa = PBXFileReference; fileEncoding = 30; lastKnownFileType = sourcecode.c.h; path = Status.h; sourceTree = "<group>"; };
17D21CF10B8BE5EF00D1EBDE /* CogSemaphore.h */ = {isa = PBXFileReference; fileEncoding = 30; lastKnownFileType = sourcecode.c.h; path = CogSemaphore.h; sourceTree = "<group>"; };
17D21CF20B8BE5EF00D1EBDE /* CogSemaphore.m */ = {isa = PBXFileReference; fileEncoding = 30; lastKnownFileType = sourcecode.c.objc; path = CogSemaphore.m; sourceTree = "<group>"; };
17D21CF10B8BE5EF00D1EBDE /* Semaphore.h */ = {isa = PBXFileReference; fileEncoding = 30; lastKnownFileType = sourcecode.c.h; path = Semaphore.h; sourceTree = "<group>"; };
17D21CF20B8BE5EF00D1EBDE /* Semaphore.m */ = {isa = PBXFileReference; fileEncoding = 30; lastKnownFileType = sourcecode.c.objc; path = Semaphore.m; sourceTree = "<group>"; };
17D21DA90B8BE76800D1EBDE /* AudioToolbox.framework */ = {isa = PBXFileReference; lastKnownFileType = wrapper.framework; name = AudioToolbox.framework; path = /System/Library/Frameworks/AudioToolbox.framework; sourceTree = "<absolute>"; };
17D21DAA0B8BE76800D1EBDE /* AudioUnit.framework */ = {isa = PBXFileReference; lastKnownFileType = wrapper.framework; name = AudioUnit.framework; path = /System/Library/Frameworks/AudioUnit.framework; sourceTree = "<absolute>"; };
17D21DAB0B8BE76800D1EBDE /* CoreAudio.framework */ = {isa = PBXFileReference; lastKnownFileType = wrapper.framework; name = CoreAudio.framework; path = /System/Library/Frameworks/CoreAudio.framework; sourceTree = "<absolute>"; };
@ -167,40 +168,51 @@
17F94DD40B8D0F7000A34E87 /* PluginController.mm */ = {isa = PBXFileReference; fileEncoding = 30; lastKnownFileType = sourcecode.cpp.objcpp; path = PluginController.mm; sourceTree = "<group>"; };
17F94DDC0B8D101100A34E87 /* Plugin.h */ = {isa = PBXFileReference; fileEncoding = 30; lastKnownFileType = sourcecode.c.h; path = Plugin.h; sourceTree = "<group>"; };
32DBCF5E0370ADEE00C91783 /* CogAudio_Prefix.pch */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CogAudio_Prefix.pch; sourceTree = "<group>"; };
831A50132865A7FD0049CFE4 /* rsstate.hpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.h; path = rsstate.hpp; sourceTree = "<group>"; };
831A50152865A8800049CFE4 /* rsstate.cpp */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = rsstate.cpp; sourceTree = "<group>"; };
831A50172865A8B30049CFE4 /* rsstate.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = rsstate.h; sourceTree = "<group>"; };
831A4F9D2865A7DC0049CFE4 /* CDSPProcessor.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CDSPProcessor.h; sourceTree = "<group>"; };
831A4F9E2865A7DC0049CFE4 /* CDSPRealFFT.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CDSPRealFFT.h; sourceTree = "<group>"; };
831A4FA02865A7DC0049CFE4 /* pffft_double.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = pffft_double.h; sourceTree = "<group>"; };
831A4FA22865A7DC0049CFE4 /* pf_neon_double_from_avx.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = pf_neon_double_from_avx.h; sourceTree = "<group>"; };
831A4FA32865A7DC0049CFE4 /* pf_double.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = pf_double.h; sourceTree = "<group>"; };
831A4FA42865A7DC0049CFE4 /* pf_neon_double.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = pf_neon_double.h; sourceTree = "<group>"; };
831A4FA52865A7DC0049CFE4 /* pf_sse2_double.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = pf_sse2_double.h; sourceTree = "<group>"; };
831A4FA62865A7DC0049CFE4 /* pf_avx_double.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = pf_avx_double.h; sourceTree = "<group>"; };
831A4FA72865A7DC0049CFE4 /* pf_scalar_double.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = pf_scalar_double.h; sourceTree = "<group>"; };
831A4FA82865A7DC0049CFE4 /* pffft_priv_impl.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = pffft_priv_impl.h; sourceTree = "<group>"; };
831A4FA92865A7DC0049CFE4 /* pffft_double.c */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.c; path = pffft_double.c; sourceTree = "<group>"; };
831A4FB72865A7DC0049CFE4 /* CDSPHBUpsampler.inc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.pascal; path = CDSPHBUpsampler.inc; sourceTree = "<group>"; };
831A4FB82865A7DC0049CFE4 /* r8butil.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = r8butil.h; sourceTree = "<group>"; };
831A4FBA2865A7DC0049CFE4 /* r8bbase.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = r8bbase.h; sourceTree = "<group>"; };
831A4FC42865A7DC0049CFE4 /* CDSPSincFilterGen.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CDSPSincFilterGen.h; sourceTree = "<group>"; };
831A4FD02865A7DC0049CFE4 /* LICENSE */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; path = LICENSE; sourceTree = "<group>"; };
831A4FD12865A7DC0049CFE4 /* CDSPResampler.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CDSPResampler.h; sourceTree = "<group>"; };
831A4FD22865A7DC0049CFE4 /* CDSPHBUpsampler.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CDSPHBUpsampler.h; sourceTree = "<group>"; };
831A4FD42865A7DC0049CFE4 /* CDSPBlockConvolver.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CDSPBlockConvolver.h; sourceTree = "<group>"; };
831A4FD52865A7DC0049CFE4 /* fft4g.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = fft4g.h; sourceTree = "<group>"; };
831A4FD62865A7DC0049CFE4 /* CDSPHBDownsampler.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CDSPHBDownsampler.h; sourceTree = "<group>"; };
831A4FD72865A7DC0049CFE4 /* r8bconf.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = r8bconf.h; sourceTree = "<group>"; };
831A4FD82865A7DC0049CFE4 /* CDSPFracInterpolator.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CDSPFracInterpolator.h; sourceTree = "<group>"; };
831A4FD92865A7DC0049CFE4 /* CDSPFIRFilter.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CDSPFIRFilter.h; sourceTree = "<group>"; };
831A4FDA2865A7DC0049CFE4 /* r8bbase.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = r8bbase.cpp; sourceTree = "<group>"; };
831A4FDB2865A7DC0049CFE4 /* pffft.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = pffft.h; sourceTree = "<group>"; };
831A50132865A7FD0049CFE4 /* r8bstate.hpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.h; path = r8bstate.hpp; sourceTree = "<group>"; };
831A50152865A8800049CFE4 /* r8bstate.cpp */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = r8bstate.cpp; sourceTree = "<group>"; };
831A50172865A8B30049CFE4 /* r8bstate.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = r8bstate.h; sourceTree = "<group>"; };
8328995127CB510F00D7F028 /* RedundantPlaylistDataStore.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; name = RedundantPlaylistDataStore.m; path = ../../Utils/RedundantPlaylistDataStore.m; sourceTree = "<group>"; };
8328995227CB511000D7F028 /* RedundantPlaylistDataStore.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = RedundantPlaylistDataStore.h; path = ../../Utils/RedundantPlaylistDataStore.h; sourceTree = "<group>"; };
8328995527CB51B700D7F028 /* SHA256Digest.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = SHA256Digest.h; path = ../../Utils/SHA256Digest.h; sourceTree = "<group>"; };
8328995627CB51B700D7F028 /* SHA256Digest.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; name = SHA256Digest.m; path = ../../Utils/SHA256Digest.m; sourceTree = "<group>"; };
8328995927CB51C900D7F028 /* Security.framework */ = {isa = PBXFileReference; lastKnownFileType = wrapper.framework; name = Security.framework; path = System/Library/Frameworks/Security.framework; sourceTree = SDKROOT; };
833442402D6EFA6700C51D38 /* VisualizationController.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = VisualizationController.h; sourceTree = "<group>"; };
833442412D6EFA6700C51D38 /* VisualizationController.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = VisualizationController.m; sourceTree = "<group>"; };
833738E92D5EA52500278628 /* DSPDownmixNode.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = DSPDownmixNode.h; sourceTree = "<group>"; };
833738EB2D5EA53500278628 /* DSPDownmixNode.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = DSPDownmixNode.m; sourceTree = "<group>"; };
833738ED2D5EA5B700278628 /* Downmix.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = Downmix.h; sourceTree = "<group>"; };
833738EE2D5EA5B700278628 /* Downmix.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = Downmix.m; sourceTree = "<group>"; };
8347C73F2796C58800FA8A7D /* NSFileHandle+CreateFile.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = "NSFileHandle+CreateFile.h"; path = "../../Utils/NSFileHandle+CreateFile.h"; sourceTree = "<group>"; };
8347C7402796C58800FA8A7D /* NSFileHandle+CreateFile.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; name = "NSFileHandle+CreateFile.m"; path = "../../Utils/NSFileHandle+CreateFile.m"; sourceTree = "<group>"; };
834A41A5287A90AB00EB9D9B /* freesurround_decoder.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = freesurround_decoder.h; sourceTree = "<group>"; };
834A41A6287A90AB00EB9D9B /* freesurround_decoder.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = freesurround_decoder.cpp; sourceTree = "<group>"; };
834A41A7287A90AB00EB9D9B /* channelmaps.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = channelmaps.cpp; sourceTree = "<group>"; };
834A41A8287A90AB00EB9D9B /* channelmaps.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = channelmaps.h; sourceTree = "<group>"; };
834FD4EA27AF8F380063BC83 /* AudioChunk.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = AudioChunk.h; sourceTree = "<group>"; };
834FD4EC27AF91220063BC83 /* AudioChunk.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = AudioChunk.m; sourceTree = "<group>"; };
834FD4EE27AF93680063BC83 /* ChunkList.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = ChunkList.h; sourceTree = "<group>"; };
834FD4EF27AF93680063BC83 /* ChunkList.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = ChunkList.m; sourceTree = "<group>"; };
83504163286447DA006B32CC /* Downmix.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = Downmix.h; sourceTree = "<group>"; };
83504164286447DA006B32CC /* Downmix.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; path = Downmix.m; sourceTree = "<group>"; };
8350416C28646149006B32CC /* CoreMedia.framework */ = {isa = PBXFileReference; lastKnownFileType = wrapper.framework; name = CoreMedia.framework; path = System/Library/Frameworks/CoreMedia.framework; sourceTree = SDKROOT; };
835C88AF279811A500E28EAE /* hdcd_decode2.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = hdcd_decode2.h; sourceTree = "<group>"; };
835C88B0279811A500E28EAE /* hdcd_decode2.c */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.c; path = hdcd_decode2.c; sourceTree = "<group>"; };
835DD2652ACAF1D90057E319 /* OutputCoreAudio.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; path = OutputCoreAudio.m; sourceTree = "<group>"; };
835DD2662ACAF1D90057E319 /* OutputCoreAudio.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = OutputCoreAudio.h; sourceTree = "<group>"; };
835DD26B2ACAF5AD0057E319 /* LICENSE.LGPL */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; path = LICENSE.LGPL; sourceTree = "<group>"; };
835DD26C2ACAF5AD0057E319 /* License.txt */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; path = License.txt; sourceTree = "<group>"; };
835DD26D2ACAF5AD0057E319 /* lpc.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = lpc.h; sourceTree = "<group>"; };
835DD26E2ACAF5AD0057E319 /* util.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = util.h; sourceTree = "<group>"; };
835DD26F2ACAF5AD0057E319 /* lpc.c */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.c; path = lpc.c; sourceTree = "<group>"; };
835FAC5C27BCA14D00BA8562 /* BadSampleCleaner.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; name = BadSampleCleaner.h; path = Utils/BadSampleCleaner.h; sourceTree = SOURCE_ROOT; };
835FAC5D27BCA14D00BA8562 /* BadSampleCleaner.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; name = BadSampleCleaner.m; path = Utils/BadSampleCleaner.m; sourceTree = SOURCE_ROOT; };
83725A7B27AA0D8A0003F694 /* Accelerate.framework */ = {isa = PBXFileReference; lastKnownFileType = wrapper.framework; name = Accelerate.framework; path = System/Library/Frameworks/Accelerate.framework; sourceTree = SDKROOT; };
@ -208,38 +220,12 @@
8377C64B27B8C51500E8BC0F /* fft_accelerate.c */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.c; path = fft_accelerate.c; sourceTree = "<group>"; };
8377C64D27B8C54400E8BC0F /* fft.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = fft.h; sourceTree = "<group>"; };
8384912618080FF100E7332D /* Logging.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = Logging.h; path = ../../Utils/Logging.h; sourceTree = "<group>"; };
838A33712D06A97D00D0D770 /* librubberband.3.dylib */ = {isa = PBXFileReference; lastKnownFileType = "compiled.mach-o.dylib"; name = librubberband.3.dylib; path = ../ThirdParty/rubberband/lib/librubberband.3.dylib; sourceTree = SOURCE_ROOT; };
839065F22853338700636FBB /* dsd2float.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = dsd2float.h; sourceTree = "<group>"; };
839366651815923C006DD712 /* CogPluginMulti.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CogPluginMulti.h; sourceTree = "<group>"; };
839366661815923C006DD712 /* CogPluginMulti.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; path = CogPluginMulti.m; sourceTree = "<group>"; };
8399CF2A27B5D1D4008751F1 /* NSDictionary+Merge.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = "NSDictionary+Merge.h"; path = "../../Utils/NSDictionary+Merge.h"; sourceTree = "<group>"; };
8399CF2B27B5D1D4008751F1 /* NSDictionary+Merge.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; name = "NSDictionary+Merge.m"; path = "../../Utils/NSDictionary+Merge.m"; sourceTree = "<group>"; };
839E56E12879450300DFB5F4 /* HrtfData.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = HrtfData.h; sourceTree = "<group>"; };
839E56E22879450300DFB5F4 /* Endianness.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = Endianness.h; sourceTree = "<group>"; };
839E56E32879450300DFB5F4 /* HrtfData.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = HrtfData.cpp; sourceTree = "<group>"; };
839E56E42879450300DFB5F4 /* IHrtfData.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = IHrtfData.h; sourceTree = "<group>"; };
839E56E928794F6300DFB5F4 /* HrtfTypes.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = HrtfTypes.h; sourceTree = "<group>"; };
839E56F6287974A100DFB5F4 /* SandboxBroker.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = SandboxBroker.h; path = ../Utils/SandboxBroker.h; sourceTree = "<group>"; };
839E899D2D5DB9D500A13526 /* VisualizationNode.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = VisualizationNode.h; sourceTree = "<group>"; };
839E899F2D5DBA1700A13526 /* VisualizationNode.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = VisualizationNode.m; sourceTree = "<group>"; };
83A349672D5C3F430096D530 /* DSPRubberbandNode.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = DSPRubberbandNode.h; sourceTree = "<group>"; };
83A349682D5C3F430096D530 /* DSPRubberbandNode.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = DSPRubberbandNode.m; sourceTree = "<group>"; };
83A3496C2D5C40490096D530 /* DSPFSurroundNode.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = DSPFSurroundNode.h; sourceTree = "<group>"; };
83A3496E2D5C405E0096D530 /* DSPFSurroundNode.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = DSPFSurroundNode.m; sourceTree = "<group>"; };
83A349702D5C41810096D530 /* FSurroundFilter.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = FSurroundFilter.h; sourceTree = "<group>"; };
83A349712D5C41810096D530 /* FSurroundFilter.mm */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.objcpp; path = FSurroundFilter.mm; sourceTree = "<group>"; };
83A349742D5C50A10096D530 /* DSPHRTFNode.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = DSPHRTFNode.h; sourceTree = "<group>"; };
83A349762D5C50B20096D530 /* DSPHRTFNode.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = DSPHRTFNode.m; sourceTree = "<group>"; };
83B74280289E027F005AAC28 /* CogAudio-Bridging-Header.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = "CogAudio-Bridging-Header.h"; sourceTree = "<group>"; };
83F8431E2D5C6272008C123B /* HeadphoneFilter.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = HeadphoneFilter.h; sourceTree = "<group>"; };
83F8431F2D5C6272008C123B /* HeadphoneFilter.mm */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.objcpp; path = HeadphoneFilter.mm; sourceTree = "<group>"; };
83F843222D5C66DA008C123B /* DSPEqualizerNode.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = DSPEqualizerNode.h; sourceTree = "<group>"; };
83F843242D5C66E9008C123B /* DSPEqualizerNode.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = DSPEqualizerNode.m; sourceTree = "<group>"; };
83F9FFF02D6EC43900026576 /* soxr.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = soxr.h; sourceTree = "<group>"; };
83F9FFF22D6EC43900026576 /* libsoxr.0.dylib */ = {isa = PBXFileReference; lastKnownFileType = "compiled.mach-o.dylib"; path = libsoxr.0.dylib; sourceTree = "<group>"; };
83F9FFF42D6EC43900026576 /* README.md */ = {isa = PBXFileReference; lastKnownFileType = net.daringfireball.markdown; path = README.md; sourceTree = "<group>"; };
83FFED502D5B08BC0044CCAF /* DSPNode.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = DSPNode.h; sourceTree = "<group>"; };
83FFED522D5B09320044CCAF /* DSPNode.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = DSPNode.m; sourceTree = "<group>"; };
839B83F9286D91ED00F529EE /* VisualizationController.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = VisualizationController.swift; sourceTree = "<group>"; };
8DC2EF5A0486A6940098B216 /* Info.plist */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.plist; path = Info.plist; sourceTree = "<group>"; };
8DC2EF5B0486A6940098B216 /* CogAudio.framework */ = {isa = PBXFileReference; explicitFileType = wrapper.framework; includeInIndex = 0; path = CogAudio.framework; sourceTree = BUILT_PRODUCTS_DIR; };
8E8D3D2D0CBAEE6E00135C1B /* AudioContainer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AudioContainer.h; sourceTree = "<group>"; };
@ -258,7 +244,6 @@
files = (
8328995A27CB51C900D7F028 /* Security.framework in Frameworks */,
83725A9127AA16D50003F694 /* AVFoundation.framework in Frameworks */,
83F9FFF82D6EC43900026576 /* libsoxr.0.dylib in Frameworks */,
8DC2EF570486A6940098B216 /* Cocoa.framework in Frameworks */,
8350416D28646149006B32CC /* CoreMedia.framework in Frameworks */,
83725A9027AA16C90003F694 /* Accelerate.framework in Frameworks */,
@ -266,7 +251,6 @@
17D21DAE0B8BE76800D1EBDE /* AudioUnit.framework in Frameworks */,
17D21DAF0B8BE76800D1EBDE /* CoreAudio.framework in Frameworks */,
17D21DB00B8BE76800D1EBDE /* CoreAudioKit.framework in Frameworks */,
838A33722D06A97D00D0D770 /* librubberband.3.dylib in Frameworks */,
);
runOnlyForDeploymentPostprocessing = 0;
};
@ -314,7 +298,6 @@
08FB77AEFE84172EC02AAC07 /* Classes */ = {
isa = PBXGroup;
children = (
83B74280289E027F005AAC28 /* CogAudio-Bridging-Header.h */,
8377C64F27B8CAAB00E8BC0F /* Visualization */,
17F94DDC0B8D101100A34E87 /* Plugin.h */,
17D21EBB0B8BF44000D1EBDE /* AudioPlayer.h */,
@ -337,7 +320,6 @@
17F94DD40B8D0F7000A34E87 /* PluginController.mm */,
17D21C750B8BE4BA00D1EBDE /* Chain */,
17D21C9B0B8BE4BA00D1EBDE /* Output */,
839E56F6287974A100DFB5F4 /* SandboxBroker.h */,
17D21C9E0B8BE4BA00D1EBDE /* Status.h */,
B0575F2C0D687A0800411D77 /* Helper.h */,
B0575F2F0D687A4000411D77 /* Helper.m */,
@ -358,7 +340,6 @@
1058C7B2FEA5585E11CA2CBB /* Other Frameworks */ = {
isa = PBXGroup;
children = (
838A33712D06A97D00D0D770 /* librubberband.3.dylib */,
83725A7B27AA0D8A0003F694 /* Accelerate.framework */,
17D21DAA0B8BE76800D1EBDE /* AudioUnit.framework */,
17D21DA90B8BE76800D1EBDE /* AudioToolbox.framework */,
@ -375,7 +356,6 @@
17D21C750B8BE4BA00D1EBDE /* Chain */ = {
isa = PBXGroup;
children = (
83A349692D5C3F430096D530 /* DSP */,
834FD4EA27AF8F380063BC83 /* AudioChunk.h */,
834FD4EC27AF91220063BC83 /* AudioChunk.m */,
17D21C760B8BE4BA00D1EBDE /* BufferChain.h */,
@ -384,16 +364,14 @@
834FD4EF27AF93680063BC83 /* ChunkList.m */,
8EC1225D0B993BD500C5B3AD /* ConverterNode.h */,
8EC1225E0B993BD500C5B3AD /* ConverterNode.m */,
83504163286447DA006B32CC /* Downmix.h */,
83504164286447DA006B32CC /* Downmix.m */,
17D21C7A0B8BE4BA00D1EBDE /* InputNode.h */,
17D21C7B0B8BE4BA00D1EBDE /* InputNode.m */,
17D21C7C0B8BE4BA00D1EBDE /* Node.h */,
17D21C7D0B8BE4BA00D1EBDE /* Node.m */,
17D21C7E0B8BE4BA00D1EBDE /* OutputNode.h */,
17D21C7F0B8BE4BA00D1EBDE /* OutputNode.m */,
83FFED502D5B08BC0044CCAF /* DSPNode.h */,
83FFED522D5B09320044CCAF /* DSPNode.m */,
839E899D2D5DB9D500A13526 /* VisualizationNode.h */,
839E899F2D5DBA1700A13526 /* VisualizationNode.m */,
);
path = Chain;
sourceTree = "<group>";
@ -401,8 +379,8 @@
17D21C9B0B8BE4BA00D1EBDE /* Output */ = {
isa = PBXGroup;
children = (
835DD2662ACAF1D90057E319 /* OutputCoreAudio.h */,
835DD2652ACAF1D90057E319 /* OutputCoreAudio.m */,
17D21C9C0B8BE4BA00D1EBDE /* OutputAVFoundation.h */,
17D21C9D0B8BE4BA00D1EBDE /* OutputAVFoundation.m */,
);
path = Output;
sourceTree = "<group>";
@ -410,13 +388,10 @@
17D21CD80B8BE5B400D1EBDE /* ThirdParty */ = {
isa = PBXGroup;
children = (
83F9FFF52D6EC43900026576 /* soxr */,
835DD2692ACAF5AD0057E319 /* lvqcl */,
834A41A4287A90AB00EB9D9B /* fsurround */,
839E56E02879450300DFB5F4 /* hrtf */,
831A50152865A8800049CFE4 /* rsstate.cpp */,
831A50172865A8B30049CFE4 /* rsstate.h */,
831A50132865A7FD0049CFE4 /* rsstate.hpp */,
831A50152865A8800049CFE4 /* r8bstate.cpp */,
831A50172865A8B30049CFE4 /* r8bstate.h */,
831A50132865A7FD0049CFE4 /* r8bstate.hpp */,
831A4F9C2865A7DC0049CFE4 /* r8brain-free-src */,
8377C64A27B8C51500E8BC0F /* deadbeef */,
835C88AE279811A500E28EAE /* hdcd */,
17D21DC40B8BE79700D1EBDE /* CoreAudioUtils */,
@ -439,8 +414,8 @@
8347C73F2796C58800FA8A7D /* NSFileHandle+CreateFile.h */,
8347C7402796C58800FA8A7D /* NSFileHandle+CreateFile.m */,
8384912618080FF100E7332D /* Logging.h */,
17D21CF10B8BE5EF00D1EBDE /* CogSemaphore.h */,
17D21CF20B8BE5EF00D1EBDE /* CogSemaphore.m */,
17D21CF10B8BE5EF00D1EBDE /* Semaphore.h */,
17D21CF20B8BE5EF00D1EBDE /* Semaphore.m */,
);
path = Utils;
sourceTree = "<group>";
@ -462,15 +437,53 @@
name = "Other Sources";
sourceTree = "<group>";
};
834A41A4287A90AB00EB9D9B /* fsurround */ = {
831A4F9C2865A7DC0049CFE4 /* r8brain-free-src */ = {
isa = PBXGroup;
children = (
834A41A5287A90AB00EB9D9B /* freesurround_decoder.h */,
834A41A6287A90AB00EB9D9B /* freesurround_decoder.cpp */,
834A41A7287A90AB00EB9D9B /* channelmaps.cpp */,
834A41A8287A90AB00EB9D9B /* channelmaps.h */,
831A4F9D2865A7DC0049CFE4 /* CDSPProcessor.h */,
831A4F9E2865A7DC0049CFE4 /* CDSPRealFFT.h */,
831A4F9F2865A7DC0049CFE4 /* pffft_double */,
831A4FB72865A7DC0049CFE4 /* CDSPHBUpsampler.inc */,
831A4FB82865A7DC0049CFE4 /* r8butil.h */,
831A4FBA2865A7DC0049CFE4 /* r8bbase.h */,
831A4FC42865A7DC0049CFE4 /* CDSPSincFilterGen.h */,
831A4FD02865A7DC0049CFE4 /* LICENSE */,
831A4FD12865A7DC0049CFE4 /* CDSPResampler.h */,
831A4FD22865A7DC0049CFE4 /* CDSPHBUpsampler.h */,
831A4FD42865A7DC0049CFE4 /* CDSPBlockConvolver.h */,
831A4FD52865A7DC0049CFE4 /* fft4g.h */,
831A4FD62865A7DC0049CFE4 /* CDSPHBDownsampler.h */,
831A4FD72865A7DC0049CFE4 /* r8bconf.h */,
831A4FD82865A7DC0049CFE4 /* CDSPFracInterpolator.h */,
831A4FD92865A7DC0049CFE4 /* CDSPFIRFilter.h */,
831A4FDA2865A7DC0049CFE4 /* r8bbase.cpp */,
831A4FDB2865A7DC0049CFE4 /* pffft.h */,
);
path = fsurround;
path = "r8brain-free-src";
sourceTree = "<group>";
};
831A4F9F2865A7DC0049CFE4 /* pffft_double */ = {
isa = PBXGroup;
children = (
831A4FA02865A7DC0049CFE4 /* pffft_double.h */,
831A4FA12865A7DC0049CFE4 /* simd */,
831A4FA82865A7DC0049CFE4 /* pffft_priv_impl.h */,
831A4FA92865A7DC0049CFE4 /* pffft_double.c */,
);
path = pffft_double;
sourceTree = "<group>";
};
831A4FA12865A7DC0049CFE4 /* simd */ = {
isa = PBXGroup;
children = (
831A4FA22865A7DC0049CFE4 /* pf_neon_double_from_avx.h */,
831A4FA32865A7DC0049CFE4 /* pf_double.h */,
831A4FA42865A7DC0049CFE4 /* pf_neon_double.h */,
831A4FA52865A7DC0049CFE4 /* pf_sse2_double.h */,
831A4FA62865A7DC0049CFE4 /* pf_avx_double.h */,
831A4FA72865A7DC0049CFE4 /* pf_scalar_double.h */,
);
path = simd;
sourceTree = "<group>";
};
835C88AE279811A500E28EAE /* hdcd */ = {
@ -482,26 +495,6 @@
path = hdcd;
sourceTree = "<group>";
};
835DD2692ACAF5AD0057E319 /* lvqcl */ = {
isa = PBXGroup;
children = (
835DD26A2ACAF5AD0057E319 /* License */,
835DD26D2ACAF5AD0057E319 /* lpc.h */,
835DD26E2ACAF5AD0057E319 /* util.h */,
835DD26F2ACAF5AD0057E319 /* lpc.c */,
);
path = lvqcl;
sourceTree = "<group>";
};
835DD26A2ACAF5AD0057E319 /* License */ = {
isa = PBXGroup;
children = (
835DD26B2ACAF5AD0057E319 /* LICENSE.LGPL */,
835DD26C2ACAF5AD0057E319 /* License.txt */,
);
path = License;
sourceTree = "<group>";
};
83725A8F27AA16C90003F694 /* Frameworks */ = {
isa = PBXGroup;
children = (
@ -523,74 +516,11 @@
8377C64F27B8CAAB00E8BC0F /* Visualization */ = {
isa = PBXGroup;
children = (
833442402D6EFA6700C51D38 /* VisualizationController.h */,
833442412D6EFA6700C51D38 /* VisualizationController.m */,
839B83F9286D91ED00F529EE /* VisualizationController.swift */,
);
path = Visualization;
sourceTree = "<group>";
};
839E56E02879450300DFB5F4 /* hrtf */ = {
isa = PBXGroup;
children = (
839E56E22879450300DFB5F4 /* Endianness.h */,
839E56E32879450300DFB5F4 /* HrtfData.cpp */,
839E56E12879450300DFB5F4 /* HrtfData.h */,
839E56E928794F6300DFB5F4 /* HrtfTypes.h */,
839E56E42879450300DFB5F4 /* IHrtfData.h */,
);
path = hrtf;
sourceTree = "<group>";
};
83A349692D5C3F430096D530 /* DSP */ = {
isa = PBXGroup;
children = (
833738ED2D5EA5B700278628 /* Downmix.h */,
833738EE2D5EA5B700278628 /* Downmix.m */,
83F8431E2D5C6272008C123B /* HeadphoneFilter.h */,
83F8431F2D5C6272008C123B /* HeadphoneFilter.mm */,
83A349702D5C41810096D530 /* FSurroundFilter.h */,
83A349712D5C41810096D530 /* FSurroundFilter.mm */,
83A349672D5C3F430096D530 /* DSPRubberbandNode.h */,
83A349682D5C3F430096D530 /* DSPRubberbandNode.m */,
83A3496C2D5C40490096D530 /* DSPFSurroundNode.h */,
83A3496E2D5C405E0096D530 /* DSPFSurroundNode.m */,
83A349742D5C50A10096D530 /* DSPHRTFNode.h */,
83A349762D5C50B20096D530 /* DSPHRTFNode.m */,
83F843222D5C66DA008C123B /* DSPEqualizerNode.h */,
83F843242D5C66E9008C123B /* DSPEqualizerNode.m */,
833738E92D5EA52500278628 /* DSPDownmixNode.h */,
833738EB2D5EA53500278628 /* DSPDownmixNode.m */,
);
path = DSP;
sourceTree = "<group>";
};
83F9FFF12D6EC43900026576 /* include */ = {
isa = PBXGroup;
children = (
83F9FFF02D6EC43900026576 /* soxr.h */,
);
path = include;
sourceTree = "<group>";
};
83F9FFF32D6EC43900026576 /* lib */ = {
isa = PBXGroup;
children = (
83F9FFF22D6EC43900026576 /* libsoxr.0.dylib */,
);
path = lib;
sourceTree = "<group>";
};
83F9FFF52D6EC43900026576 /* soxr */ = {
isa = PBXGroup;
children = (
83F9FFF12D6EC43900026576 /* include */,
83F9FFF32D6EC43900026576 /* lib */,
83F9FFF42D6EC43900026576 /* README.md */,
);
name = soxr;
path = ../ThirdParty/soxr;
sourceTree = SOURCE_ROOT;
};
/* End PBXGroup section */
/* Begin PBXHeadersBuildPhase section */
@ -598,71 +528,73 @@
isa = PBXHeadersBuildPhase;
buildActionMask = 2147483647;
files = (
833442422D6EFA6700C51D38 /* VisualizationController.h in Headers */,
833738F02D5EA5B700278628 /* Downmix.h in Headers */,
834FD4EB27AF8F380063BC83 /* AudioChunk.h in Headers */,
83F843202D5C6272008C123B /* HeadphoneFilter.h in Headers */,
83A349732D5C41810096D530 /* FSurroundFilter.h in Headers */,
839E56E82879450300DFB5F4 /* IHrtfData.h in Headers */,
17D21CA10B8BE4BA00D1EBDE /* BufferChain.h in Headers */,
831A50142865A7FD0049CFE4 /* rsstate.hpp in Headers */,
835DD2682ACAF1D90057E319 /* OutputCoreAudio.h in Headers */,
834A41AC287A90AB00EB9D9B /* channelmaps.h in Headers */,
83A3496D2D5C40490096D530 /* DSPFSurroundNode.h in Headers */,
83A3496B2D5C3F430096D530 /* DSPRubberbandNode.h in Headers */,
831A4FE02865A7DC0049CFE4 /* pf_double.h in Headers */,
831A50142865A7FD0049CFE4 /* r8bstate.hpp in Headers */,
17D21CA50B8BE4BA00D1EBDE /* InputNode.h in Headers */,
833738EA2D5EA52500278628 /* DSPDownmixNode.h in Headers */,
83F843232D5C66DA008C123B /* DSPEqualizerNode.h in Headers */,
834A41A9287A90AB00EB9D9B /* freesurround_decoder.h in Headers */,
834FD4F027AF93680063BC83 /* ChunkList.h in Headers */,
835DD2732ACAF5AD0057E319 /* util.h in Headers */,
17D21CA70B8BE4BA00D1EBDE /* Node.h in Headers */,
8399CF2C27B5D1D5008751F1 /* NSDictionary+Merge.h in Headers */,
831A4FF32865A7DC0049CFE4 /* r8butil.h in Headers */,
17D21CA90B8BE4BA00D1EBDE /* OutputNode.h in Headers */,
8EC1225F0B993BD500C5B3AD /* ConverterNode.h in Headers */,
8328995427CB511000D7F028 /* RedundantPlaylistDataStore.h in Headers */,
839E56E52879450300DFB5F4 /* HrtfData.h in Headers */,
83FFED512D5B08BC0044CCAF /* DSPNode.h in Headers */,
839E899E2D5DB9D500A13526 /* VisualizationNode.h in Headers */,
83A349752D5C50A10096D530 /* DSPHRTFNode.h in Headers */,
83F9FFF62D6EC43900026576 /* soxr.h in Headers */,
831A50102865A7DC0049CFE4 /* CDSPFIRFilter.h in Headers */,
831A4FFE2865A7DC0049CFE4 /* CDSPSincFilterGen.h in Headers */,
831A4FF52865A7DC0049CFE4 /* r8bbase.h in Headers */,
831A50082865A7DC0049CFE4 /* CDSPResampler.h in Headers */,
17D21CC50B8BE4BA00D1EBDE /* OutputAVFoundation.h in Headers */,
83504165286447DA006B32CC /* Downmix.h in Headers */,
831A4FDE2865A7DC0049CFE4 /* pffft_double.h in Headers */,
831A4FE12865A7DC0049CFE4 /* pf_neon_double.h in Headers */,
17D21CC70B8BE4BA00D1EBDE /* Status.h in Headers */,
17D21CF30B8BE5EF00D1EBDE /* CogSemaphore.h in Headers */,
839E56E62879450300DFB5F4 /* Endianness.h in Headers */,
17D21CF30B8BE5EF00D1EBDE /* Semaphore.h in Headers */,
17D21DC70B8BE79700D1EBDE /* CoreAudioUtils.h in Headers */,
835DD2722ACAF5AD0057E319 /* lpc.h in Headers */,
17D21EBD0B8BF44000D1EBDE /* AudioPlayer.h in Headers */,
831A50182865A8B30049CFE4 /* rsstate.h in Headers */,
831A50182865A8B30049CFE4 /* r8bstate.h in Headers */,
831A4FE52865A7DC0049CFE4 /* pffft_priv_impl.h in Headers */,
831A4FE42865A7DC0049CFE4 /* pf_scalar_double.h in Headers */,
831A500D2865A7DC0049CFE4 /* CDSPHBDownsampler.h in Headers */,
831A500C2865A7DC0049CFE4 /* fft4g.h in Headers */,
834FD4F027AF93680063BC83 /* ChunkList.h in Headers */,
17F94DD50B8D0F7000A34E87 /* PluginController.h in Headers */,
831A50122865A7DC0049CFE4 /* pffft.h in Headers */,
831A4FDC2865A7DC0049CFE4 /* CDSPProcessor.h in Headers */,
831A4FDD2865A7DC0049CFE4 /* CDSPRealFFT.h in Headers */,
17F94DDD0B8D101100A34E87 /* Plugin.h in Headers */,
8328995727CB51B700D7F028 /* SHA256Digest.h in Headers */,
834FD4EB27AF8F380063BC83 /* AudioChunk.h in Headers */,
17A2D3C50B8D1D37000778C4 /* AudioDecoder.h in Headers */,
831A50092865A7DC0049CFE4 /* CDSPHBUpsampler.h in Headers */,
831A500E2865A7DC0049CFE4 /* r8bconf.h in Headers */,
8347C7412796C58800FA8A7D /* NSFileHandle+CreateFile.h in Headers */,
83B74281289E027F005AAC28 /* CogAudio-Bridging-Header.h in Headers */,
17C940230B900909008627D6 /* AudioMetadataReader.h in Headers */,
839E56F7287974A100DFB5F4 /* SandboxBroker.h in Headers */,
831A500F2865A7DC0049CFE4 /* CDSPFracInterpolator.h in Headers */,
831A4FE22865A7DC0049CFE4 /* pf_sse2_double.h in Headers */,
839065F32853338700636FBB /* dsd2float.h in Headers */,
17B619300B909BC300BC003F /* AudioPropertiesReader.h in Headers */,
831A4FDF2865A7DC0049CFE4 /* pf_neon_double_from_avx.h in Headers */,
831A4FE32865A7DC0049CFE4 /* pf_avx_double.h in Headers */,
839366671815923C006DD712 /* CogPluginMulti.h in Headers */,
17ADB13C0B97926D00257CA2 /* AudioSource.h in Headers */,
831A500B2865A7DC0049CFE4 /* CDSPBlockConvolver.h in Headers */,
835C88B1279811A500E28EAE /* hdcd_decode2.h in Headers */,
8EC1225F0B993BD500C5B3AD /* ConverterNode.h in Headers */,
8384912718080FF100E7332D /* Logging.h in Headers */,
8377C64E27B8C54400E8BC0F /* fft.h in Headers */,
835FAC5E27BCA14D00BA8562 /* BadSampleCleaner.h in Headers */,
8E8D3D2F0CBAEE6E00135C1B /* AudioContainer.h in Headers */,
B0575F2D0D687A0800411D77 /* Helper.h in Headers */,
07DB5F3E0ED353A900C2E3EF /* AudioMetadataWriter.h in Headers */,
839E56EA28794F6300DFB5F4 /* HrtfTypes.h in Headers */,
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXHeadersBuildPhase section */
/* Begin PBXNativeTarget section */
8DC2EF4F0486A6940098B216 /* CogAudio */ = {
8DC2EF4F0486A6940098B216 /* CogAudio Framework */ = {
isa = PBXNativeTarget;
buildConfigurationList = 1DEB91AD08733DA50010E9CD /* Build configuration list for PBXNativeTarget "CogAudio" */;
buildConfigurationList = 1DEB91AD08733DA50010E9CD /* Build configuration list for PBXNativeTarget "CogAudio Framework" */;
buildPhases = (
17D21D2B0B8BE6A200D1EBDE /* CopyFiles */,
8DC2EF500486A6940098B216 /* Headers */,
8DC2EF540486A6940098B216 /* Sources */,
8DC2EF560486A6940098B216 /* Frameworks */,
@ -673,7 +605,7 @@
);
dependencies = (
);
name = CogAudio;
name = "CogAudio Framework";
productInstallPath = "$(HOME)/Library/Frameworks";
productName = CogAudio;
productReference = 8DC2EF5B0486A6940098B216 /* CogAudio.framework */;
@ -685,8 +617,7 @@
0867D690FE84028FC02AAC07 /* Project object */ = {
isa = PBXProject;
attributes = {
BuildIndependentTargetsInParallel = YES;
LastUpgradeCheck = 1620;
LastUpgradeCheck = 1400;
TargetAttributes = {
8DC2EF4F0486A6940098B216 = {
LastSwiftMigration = 1330;
@ -707,7 +638,7 @@
projectDirPath = "";
projectRoot = "";
targets = (
8DC2EF4F0486A6940098B216 /* CogAudio */,
8DC2EF4F0486A6940098B216 /* CogAudio Framework */,
);
};
/* End PBXProject section */
@ -717,6 +648,7 @@
isa = PBXResourcesBuildPhase;
buildActionMask = 2147483647;
files = (
831A50072865A7DC0049CFE4 /* LICENSE in Resources */,
);
runOnlyForDeploymentPostprocessing = 0;
};
@ -728,47 +660,37 @@
buildActionMask = 2147483647;
files = (
17D21CA20B8BE4BA00D1EBDE /* BufferChain.m in Sources */,
83A349772D5C50B20096D530 /* DSPHRTFNode.m in Sources */,
17D21CA60B8BE4BA00D1EBDE /* InputNode.m in Sources */,
83A3496A2D5C3F430096D530 /* DSPRubberbandNode.m in Sources */,
831A50112865A7DC0049CFE4 /* r8bbase.cpp in Sources */,
83504166286447DA006B32CC /* Downmix.m in Sources */,
8399CF2D27B5D1D5008751F1 /* NSDictionary+Merge.m in Sources */,
83F843252D5C66E9008C123B /* DSPEqualizerNode.m in Sources */,
834A41AB287A90AB00EB9D9B /* channelmaps.cpp in Sources */,
833738EC2D5EA53500278628 /* DSPDownmixNode.m in Sources */,
833442432D6EFA6700C51D38 /* VisualizationController.m in Sources */,
831A50162865A8800049CFE4 /* rsstate.cpp in Sources */,
831A50162865A8800049CFE4 /* r8bstate.cpp in Sources */,
17D21CA80B8BE4BA00D1EBDE /* Node.m in Sources */,
17D21CAA0B8BE4BA00D1EBDE /* OutputNode.m in Sources */,
17D21CC60B8BE4BA00D1EBDE /* OutputAVFoundation.m in Sources */,
835C88B2279811A500E28EAE /* hdcd_decode2.c in Sources */,
835FAC5F27BCA14D00BA8562 /* BadSampleCleaner.m in Sources */,
834FD4ED27AF91220063BC83 /* AudioChunk.m in Sources */,
833738EF2D5EA5B700278628 /* Downmix.m in Sources */,
17D21CF40B8BE5EF00D1EBDE /* CogSemaphore.m in Sources */,
839E89A02D5DBA1700A13526 /* VisualizationNode.m in Sources */,
17D21CF40B8BE5EF00D1EBDE /* Semaphore.m in Sources */,
839B83FA286D91ED00F529EE /* VisualizationController.swift in Sources */,
8347C7422796C58800FA8A7D /* NSFileHandle+CreateFile.m in Sources */,
83A3496F2D5C405E0096D530 /* DSPFSurroundNode.m in Sources */,
17D21DC80B8BE79700D1EBDE /* CoreAudioUtils.m in Sources */,
8328995327CB511000D7F028 /* RedundantPlaylistDataStore.m in Sources */,
8377C64C27B8C51500E8BC0F /* fft_accelerate.c in Sources */,
839366681815923C006DD712 /* CogPluginMulti.m in Sources */,
17D21EBE0B8BF44000D1EBDE /* AudioPlayer.m in Sources */,
17F94DD60B8D0F7000A34E87 /* PluginController.mm in Sources */,
839E56E72879450300DFB5F4 /* HrtfData.cpp in Sources */,
831A4FE62865A7DC0049CFE4 /* pffft_double.c in Sources */,
831A4FF22865A7DC0049CFE4 /* CDSPHBUpsampler.inc in Sources */,
17A2D3C60B8D1D37000778C4 /* AudioDecoder.m in Sources */,
8328995827CB51B700D7F028 /* SHA256Digest.m in Sources */,
17C940240B900909008627D6 /* AudioMetadataReader.m in Sources */,
17B619310B909BC300BC003F /* AudioPropertiesReader.m in Sources */,
83F843212D5C6272008C123B /* HeadphoneFilter.mm in Sources */,
17ADB13D0B97926D00257CA2 /* AudioSource.m in Sources */,
834FD4F127AF93680063BC83 /* ChunkList.m in Sources */,
83FFED532D5B09320044CCAF /* DSPNode.m in Sources */,
8EC122600B993BD500C5B3AD /* ConverterNode.m in Sources */,
835DD2672ACAF1D90057E319 /* OutputCoreAudio.m in Sources */,
83A349722D5C41810096D530 /* FSurroundFilter.mm in Sources */,
8E8D3D300CBAEE6E00135C1B /* AudioContainer.m in Sources */,
B0575F300D687A4000411D77 /* Helper.m in Sources */,
835DD2742ACAF5AD0057E319 /* lpc.c in Sources */,
834A41AA287A90AB00EB9D9B /* freesurround_decoder.cpp in Sources */,
07DB5F3F0ED353A900C2E3EF /* AudioMetadataWriter.m in Sources */,
);
runOnlyForDeploymentPostprocessing = 0;
@ -779,33 +701,28 @@
1DEB91AE08733DA50010E9CD /* Debug */ = {
isa = XCBuildConfiguration;
buildSettings = {
CLANG_CXX_LANGUAGE_STANDARD = "c++17";
CLANG_ENABLE_MODULES = YES;
COMBINE_HIDPI_IMAGES = YES;
COPY_PHASE_STRIP = NO;
DEAD_CODE_STRIPPING = YES;
DEFINES_MODULE = YES;
DYLIB_COMPATIBILITY_VERSION = 1;
DYLIB_CURRENT_VERSION = 1;
ENABLE_MODULE_VERIFIER = YES;
FRAMEWORK_VERSION = A;
GCC_DYNAMIC_NO_PIC = NO;
GCC_ENABLE_OBJC_EXCEPTIONS = YES;
GCC_OPTIMIZATION_LEVEL = 0;
GCC_PRECOMPILE_PREFIX_HEADER = YES;
GCC_PREFIX_HEADER = CogAudio_Prefix.pch;
GCC_PREPROCESSOR_DEFINITIONS = "DEBUG=1";
HEADER_SEARCH_PATHS = (
../ThirdParty/soxr/include,
../ThirdParty/rubberband/include,
GCC_PREPROCESSOR_DEFINITIONS = (
"DEBUG=1",
"R8B_EXTFFT=1",
"R8B_PFFFT_DOUBLE=1",
);
INFOPLIST_FILE = Info.plist;
INSTALL_PATH = "@executable_path/../Frameworks";
LD_RUNPATH_SEARCH_PATHS = "@loader_path/Frameworks";
LIBRARY_SEARCH_PATHS = (
../ThirdParty/soxr/lib,
../ThirdParty/rubberband/lib,
);
MODULE_VERIFIER_SUPPORTED_LANGUAGE_STANDARDS = "gnu17 c++17";
LIBRARY_SEARCH_PATHS = "$(inherited)";
OTHER_LDFLAGS = "";
PRODUCT_BUNDLE_IDENTIFIER = org.cogx.cogaudio;
PRODUCT_NAME = CogAudio;
@ -820,30 +737,24 @@
1DEB91AF08733DA50010E9CD /* Release */ = {
isa = XCBuildConfiguration;
buildSettings = {
CLANG_CXX_LANGUAGE_STANDARD = "c++17";
CLANG_ENABLE_MODULES = YES;
COMBINE_HIDPI_IMAGES = YES;
DEAD_CODE_STRIPPING = YES;
DEFINES_MODULE = YES;
DYLIB_COMPATIBILITY_VERSION = 1;
DYLIB_CURRENT_VERSION = 1;
ENABLE_MODULE_VERIFIER = YES;
FRAMEWORK_VERSION = A;
GCC_ENABLE_OBJC_EXCEPTIONS = YES;
GCC_PRECOMPILE_PREFIX_HEADER = YES;
GCC_PREFIX_HEADER = CogAudio_Prefix.pch;
GCC_PREPROCESSOR_DEFINITIONS = "";
HEADER_SEARCH_PATHS = (
../ThirdParty/soxr/include,
../ThirdParty/rubberband/include,
GCC_PREPROCESSOR_DEFINITIONS = (
"R8B_EXTFFT=1",
"R8B_PFFFT_DOUBLE=1",
);
INFOPLIST_FILE = Info.plist;
INSTALL_PATH = "@executable_path/../Frameworks";
LD_RUNPATH_SEARCH_PATHS = "@loader_path/Frameworks";
LIBRARY_SEARCH_PATHS = (
../ThirdParty/soxr/lib,
../ThirdParty/rubberband/lib,
);
MODULE_VERIFIER_SUPPORTED_LANGUAGE_STANDARDS = "gnu17 c++17";
LIBRARY_SEARCH_PATHS = "$(inherited)";
OTHER_LDFLAGS = "";
PRODUCT_BUNDLE_IDENTIFIER = org.cogx.cogaudio;
PRODUCT_NAME = CogAudio;
@ -859,7 +770,6 @@
buildSettings = {
ALWAYS_SEARCH_USER_PATHS = NO;
CLANG_ANALYZER_LOCALIZABILITY_NONLOCALIZED = YES;
CLANG_ENABLE_MODULES = YES;
CLANG_ENABLE_OBJC_ARC = YES;
CLANG_WARN_BLOCK_CAPTURE_AUTORELEASING = YES;
CLANG_WARN_BOOL_CONVERSION = YES;
@ -881,10 +791,8 @@
CLANG_WARN__DUPLICATE_METHOD_MATCH = YES;
COPY_PHASE_STRIP = NO;
DEAD_CODE_STRIPPING = YES;
DEFINES_MODULE = YES;
ENABLE_STRICT_OBJC_MSGSEND = YES;
ENABLE_TESTABILITY = YES;
ENABLE_USER_SCRIPT_SANDBOXING = YES;
GCC_NO_COMMON_BLOCKS = YES;
GCC_PREPROCESSOR_DEFINITIONS = (
"DEBUG=1",
@ -898,11 +806,7 @@
GCC_WARN_UNUSED_VARIABLE = YES;
MACOSX_DEPLOYMENT_TARGET = 10.13;
ONLY_ACTIVE_ARCH = YES;
OTHER_CFLAGS = "-Wframe-larger-than=4000";
OTHER_CPLUSPLUSFLAGS = "-Wframe-larger-than=16000";
PRODUCT_MODULE_NAME = CogAudio;
SDKROOT = macosx;
SWIFT_OBJC_BRIDGING_HEADER = "CogAudio-Bridging-Header.h";
SYMROOT = ../build;
};
name = Debug;
@ -912,7 +816,6 @@
buildSettings = {
ALWAYS_SEARCH_USER_PATHS = NO;
CLANG_ANALYZER_LOCALIZABILITY_NONLOCALIZED = YES;
CLANG_ENABLE_MODULES = YES;
CLANG_ENABLE_OBJC_ARC = YES;
CLANG_WARN_BLOCK_CAPTURE_AUTORELEASING = YES;
CLANG_WARN_BOOL_CONVERSION = YES;
@ -934,9 +837,7 @@
CLANG_WARN__DUPLICATE_METHOD_MATCH = YES;
DEAD_CODE_STRIPPING = YES;
DEBUG_INFORMATION_FORMAT = "dwarf-with-dsym";
DEFINES_MODULE = YES;
ENABLE_STRICT_OBJC_MSGSEND = YES;
ENABLE_USER_SCRIPT_SANDBOXING = YES;
GCC_NO_COMMON_BLOCKS = YES;
GCC_PREPROCESSOR_DEFINITIONS = "HAVE_CONFIG_H=1";
GCC_WARN_64_TO_32_BIT_CONVERSION = YES;
@ -946,12 +847,7 @@
GCC_WARN_UNUSED_FUNCTION = YES;
GCC_WARN_UNUSED_VARIABLE = YES;
MACOSX_DEPLOYMENT_TARGET = 10.13;
OTHER_CFLAGS = "-Wframe-larger-than=4000";
OTHER_CPLUSPLUSFLAGS = "-Wframe-larger-than=16000";
PRODUCT_MODULE_NAME = CogAudio;
SDKROOT = macosx;
SWIFT_COMPILATION_MODE = wholemodule;
SWIFT_OBJC_BRIDGING_HEADER = "CogAudio-Bridging-Header.h";
SYMROOT = ../build;
};
name = Release;
@ -959,7 +855,7 @@
/* End XCBuildConfiguration section */
/* Begin XCConfigurationList section */
1DEB91AD08733DA50010E9CD /* Build configuration list for PBXNativeTarget "CogAudio" */ = {
1DEB91AD08733DA50010E9CD /* Build configuration list for PBXNativeTarget "CogAudio Framework" */ = {
isa = XCConfigurationList;
buildConfigurations = (
1DEB91AE08733DA50010E9CD /* Debug */,


@ -1,6 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<Scheme
LastUpgradeVersion = "1620"
LastUpgradeVersion = "1400"
version = "1.3">
<BuildAction
parallelizeBuildables = "YES"
@ -16,7 +16,7 @@
BuildableIdentifier = "primary"
BlueprintIdentifier = "8DC2EF4F0486A6940098B216"
BuildableName = "CogAudio.framework"
BlueprintName = "CogAudio"
BlueprintName = "CogAudio Framework"
ReferencedContainer = "container:CogAudio.xcodeproj">
</BuildableReference>
</BuildActionEntry>
@ -45,7 +45,7 @@
BuildableIdentifier = "primary"
BlueprintIdentifier = "8DC2EF4F0486A6940098B216"
BuildableName = "CogAudio.framework"
BlueprintName = "CogAudio"
BlueprintName = "CogAudio Framework"
ReferencedContainer = "container:CogAudio.xcodeproj">
</BuildableReference>
</MacroExpansion>
@ -61,7 +61,7 @@
BuildableIdentifier = "primary"
BlueprintIdentifier = "8DC2EF4F0486A6940098B216"
BuildableName = "CogAudio.framework"
BlueprintName = "CogAudio"
BlueprintName = "CogAudio Framework"
ReferencedContainer = "container:CogAudio.xcodeproj">
</BuildableReference>
</MacroExpansion>


@ -79,9 +79,9 @@ static void *kCogDecoderMultiContext = &kCogDecoderMultiContext;
return @{};
}
- (AudioChunk *)readAudio {
if(theDecoder != nil) return [theDecoder readAudio];
return nil;
- (int)readAudio:(void *)buffer frames:(UInt32)frames {
if(theDecoder != nil) return [theDecoder readAudio:buffer frames:frames];
return 0;
}
- (BOOL)open:(id<CogSource>)source {


@ -7,5 +7,5 @@
*
*/
double logarithmicToLinear(const double logarithmic, double MAX_VOLUME);
double linearToLogarithmic(const double linear, double MAX_VOLUME);
double logarithmicToLinear(double logarithmic, double MAX_VOLUME);
double linearToLogarithmic(double linear, double MAX_VOLUME);


@ -13,13 +13,13 @@
// These functions are helpers for the process of converting volume from a linear to logarithmic scale.
// Numbers that go into audioPlayer should be logarithmic. Numbers that are displayed to the user should be linear.
// Here's why: http://www.dr-lex.34sp.com/info-stuff/volumecontrols.html
// We are using the approximation of X^2 when volume is limited to 100% and X^4 when volume is limited to 800%.
// We are using the approximation of X^4.
// Input/Output values are in percents.
double logarithmicToLinear(const double logarithmic, double MAX_VOLUME) {
return (MAX_VOLUME == 100.0) ? pow((logarithmic / MAX_VOLUME), 0.5) * 100.0 : pow((logarithmic / MAX_VOLUME), 0.25) * 100.0;
double logarithmicToLinear(double logarithmic, double MAX_VOLUME) {
return (MAX_VOLUME == 100.0) ? logarithmic : pow((logarithmic / MAX_VOLUME), 0.25) * 100.0;
}
double linearToLogarithmic(const double linear, double MAX_VOLUME) {
return (MAX_VOLUME == 100.0) ? (linear / 100.0) * (linear / 100.0) * MAX_VOLUME : (linear / 100.0) * (linear / 100.0) * (linear / 100.0) * (linear / 100.0) * MAX_VOLUME;
double linearToLogarithmic(double linear, double MAX_VOLUME) {
return (MAX_VOLUME == 100.0) ? linear : (linear / 100.0) * (linear / 100.0) * (linear / 100.0) * (linear / 100.0) * MAX_VOLUME;
}
// End helper volume function thingies. ONWARDS TO GLORY!
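
As a quick check of the X^4-only variant shown above: with an assumed MAX_VOLUME of 800, a 50% slider maps to (0.5)^4 * 800 = 50 on the player side, and (50 / 800)^0.25 * 100 = 50 on the way back, so the pair round-trips. A minimal standalone C sketch of that check (the helper names and the 800 limit are illustrative, not part of this change):

#include <assert.h>
#include <math.h>
#include <stdio.h>

/* Mirrors the X^4 pair above: linear (UI percent) <-> logarithmic (player value). */
static double linToLog(double linear, double maxVolume) {
	return (maxVolume == 100.0) ? linear : pow(linear / 100.0, 4.0) * maxVolume;
}

static double logToLin(double logarithmic, double maxVolume) {
	return (maxVolume == 100.0) ? logarithmic : pow(logarithmic / maxVolume, 0.25) * 100.0;
}

int main(void) {
	const double maxVolume = 800.0; /* assumed example limit */
	for(double slider = 0.0; slider <= 100.0; slider += 12.5) {
		double player = linToLog(slider, maxVolume);
		double back = logToLin(player, maxVolume);
		printf("slider %6.2f%% -> player %8.3f -> slider %6.2f%%\n", slider, player, back);
		assert(fabs(back - slider) < 1e-9); /* round-trip holds */
	}
	return 0;
}
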


@ -24,9 +24,7 @@ using std::atomic_long;
#import "Downmix.h"
#import <CogAudio/CogAudio-Swift.h>
#import "HeadphoneFilter.h"
#import "VisualizationController.h"
//#define OUTPUT_LOG
#ifdef OUTPUT_LOG
@ -35,17 +33,13 @@ using std::atomic_long;
@class OutputNode;
@class FSurroundFilter;
@interface OutputAVFoundation : NSObject {
OutputNode *outputController;
BOOL rsDone;
void *rsstate, *rsold;
BOOL r8bDone;
void *r8bstate, *r8bold;
double lastClippedSampleRate;
void *rsvis;
void *r8bvis;
double lastVisRate;
BOOL stopInvoked;
@ -61,8 +55,9 @@ using std::atomic_long;
BOOL eqEnabled;
BOOL eqInitialized;
BOOL dontRemix;
BOOL streamFormatStarted;
BOOL streamFormatChanged;
double secondsHdcdSustained;
@ -76,16 +71,13 @@ using std::atomic_long;
float eqPreamp;
AudioDeviceID outputDeviceID;
AudioStreamBasicDescription realStreamFormat; // stream format pre-hrtf
AudioStreamBasicDescription streamFormat; // stream format last seen in render callback
AudioStreamBasicDescription realNewFormat; // in case of resampler flush
AudioStreamBasicDescription newFormat; // in case of resampler flush
AudioStreamBasicDescription visFormat; // Mono format for vis
uint32_t realStreamChannelConfig;
uint32_t deviceChannelConfig;
uint32_t streamChannelConfig;
uint32_t realNewChannelConfig;
uint32_t newChannelConfig;
AVSampleBufferAudioRenderer *audioRenderer;
@ -93,6 +85,8 @@ using std::atomic_long;
CMAudioFormatDescriptionRef audioFormatDescription;
AudioChannelLayoutTag streamTag;
id currentPtsObserver;
NSLock *currentPtsLock;
CMTime currentPts, lastPts;
@ -109,28 +103,8 @@ using std::atomic_long;
VisualizationController *visController;
BOOL enableHrtf;
HeadphoneFilter *hrtf;
BOOL enableFSurround;
BOOL FSurroundDelayRemoved;
int inputBufferLastTime;
FSurroundFilter *fsurround;
BOOL resetStreamFormat;
BOOL shouldPlayOutBuffer;
float *samplePtr;
float tempBuffer[512 * 32];
float rsTempBuffer[4096 * 32];
float inputBuffer[4096 * 32]; // 4096 samples times maximum supported channel count
float fsurroundBuffer[8192 * 6];
float hrtfBuffer[4096 * 2];
float eqBuffer[4096 * 32];
float visAudio[4096];
float visTemp[8192];
float inputBuffer[2048 * 32]; // 2048 samples times maximum supported channel count
float eqBuffer[2048 * 32];
#ifdef OUTPUT_LOG
FILE *_logFile;
@ -153,8 +127,6 @@ using std::atomic_long;
- (void)setEqualizerEnabled:(BOOL)enabled;
- (void)setShouldPlayOutBuffer:(BOOL)enabled;
- (void)sustainHDCD;
@end


@ -17,9 +17,7 @@
#import <Accelerate/Accelerate.h>
#import "rsstate.h"
#import "FSurroundFilter.h"
#import "r8bstate.h"
extern void scale_by_volume(float *buffer, size_t count, float volume);
@ -48,20 +46,24 @@ static void clearBuffers(AudioBufferList *ioData, size_t count, size_t offset) {
}
static OSStatus eqRenderCallback(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList *ioData) {
if(inNumberFrames > 4096 || !inRefCon) {
if(inNumberFrames > 1024 || !inRefCon) {
clearBuffers(ioData, inNumberFrames, 0);
return 0;
}
OutputAVFoundation *_self = (__bridge OutputAVFoundation *)inRefCon;
fillBuffers(ioData, _self->samplePtr, inNumberFrames, 0);
fillBuffers(ioData, &_self->inputBuffer[0], inNumberFrames, 0);
return 0;
}
- (int)renderInput:(int)amountToRead toBuffer:(float *)buffer {
int amountRead = 0;
- (int)renderInput {
int amountToRead, amountRead = 0;
amountToRead = 1024;
float visAudio[amountToRead]; // Chunk size
if(stopping == YES || [outputController shouldContinue] == NO) {
// Chain is dead, fill out the serial number pointer forever with silence
@ -69,13 +71,17 @@ static OSStatus eqRenderCallback(void *inRefCon, AudioUnitRenderActionFlags *ioA
return 0;
}
AudioStreamBasicDescription format;
uint32_t config;
if([outputController peekFormat:&format channelConfig:&config]) {
AudioChunk *chunk = [outputController readChunk:1024];
int frameCount = (int)[chunk frameCount];
AudioStreamBasicDescription format = [chunk format];
uint32_t config = [chunk channelConfig];
double chunkDuration = 0;
if(frameCount) {
// XXX ERROR with AirPods - Can't go higher than CD*8 surround - 192k stereo
// Emits to console: [AUScotty] Initialize: invalid FFT size 16384
// DSD256 stereo emits: [AUScotty] Initialize: invalid FFT size 65536
BOOL formatClipped = NO;
BOOL isSurround = format.mChannelsPerFrame > 2;
const double maxSampleRate = isSurround ? 352800.0 : 192000.0;
@ -85,23 +91,19 @@ static OSStatus eqRenderCallback(void *inRefCon, AudioUnitRenderActionFlags *ioA
format.mSampleRate = maxSampleRate;
dstRate = maxSampleRate;
formatClipped = YES;
if(srcRate != lastClippedSampleRate) {
lastClippedSampleRate = srcRate;
streamFormatStarted = NO;
}
}
if(!streamFormatStarted || config != realStreamChannelConfig || memcmp(&newFormat, &format, sizeof(format)) != 0) {
if(!streamFormatStarted || config != streamChannelConfig || memcmp(&newFormat, &format, sizeof(format)) != 0) {
[currentPtsLock lock];
if(formatClipped) {
ALog(@"Sample rate clipped to no more than %f Hz!", maxSampleRate);
if(rsstate) {
rsold = rsstate;
rsstate = NULL;
if(r8bstate) {
r8bold = r8bstate;
r8bstate = NULL;
}
rsstate = rsstate_new(format.mChannelsPerFrame, srcRate, dstRate);
} else if(rsstate) {
rsold = rsstate;
rsstate = NULL;
r8bstate = r8bstate_new(format.mChannelsPerFrame, 1024, srcRate, dstRate);
} else if(r8bstate) {
r8bold = r8bstate;
r8bstate = NULL;
}
[currentPtsLock unlock];
newFormat = format;
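
The clamp in this hunk works around the AUScotty "invalid FFT size" errors noted in the comments above: anything over 192 kHz for mono/stereo, or over 352.8 kHz for surround, is resampled down to the limit before reaching the renderer. A minimal standalone C sketch of just that decision (the function and variable names here are illustrative):

#include <stdbool.h>
#include <stdio.h>

/* Surround (>2 channels) is allowed up to 352.8 kHz, mono/stereo up to 192 kHz,
 * matching the limits used in the render path above. */
static double clampSampleRate(unsigned channels, double srcRate, bool *clipped) {
	const bool isSurround = channels > 2;
	const double maxSampleRate = isSurround ? 352800.0 : 192000.0;
	*clipped = srcRate > maxSampleRate;
	return *clipped ? maxSampleRate : srcRate;
}

int main(void) {
	bool clipped;
	double r = clampSampleRate(2, 352800.0, &clipped); /* 352.8 kHz stereo gets clipped */
	printf("stereo 352800 -> %.0f (clipped: %d)\n", r, clipped);
	r = clampSampleRate(6, 352800.0, &clipped);        /* 352.8 kHz 5.1 passes through */
	printf("5.1    352800 -> %.0f (clipped: %d)\n", r, clipped);
	return 0;
}
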
@ -114,26 +116,13 @@ static OSStatus eqRenderCallback(void *inRefCon, AudioUnitRenderActionFlags *ioA
visFormat.mBytesPerPacket = visFormat.mBytesPerFrame * visFormat.mFramesPerPacket;
downmixerForVis = [[DownmixProcessor alloc] initWithInputFormat:format inputConfig:config andOutputFormat:visFormat outputConfig:AudioConfigMono];
if(!rsold) {
realStreamFormat = format;
realStreamChannelConfig = config;
streamFormatChanged = YES;
}
if(!r8bold) {
streamFormat = format;
streamChannelConfig = config;
[self updateStreamFormat];
}
}
if(streamFormatChanged) {
return 0;
}
AudioChunk *chunk = [outputController readChunk:amountToRead];
int frameCount = (int)[chunk frameCount];
format = [chunk format];
config = [chunk channelConfig];
double chunkDuration = 0;
if(frameCount) {
chunkDuration = [chunk duration];
NSData *samples = [chunk removeSamples:frameCount];
@ -143,15 +132,16 @@ static OSStatus eqRenderCallback(void *inRefCon, AudioUnitRenderActionFlags *ioA
location:@"pre downmix"];
#endif
// It should be fine to request up to double, we'll only get downsampled
float outputBuffer[2048 * newFormat.mChannelsPerFrame];
const float *outputPtr = (const float *)[samples bytes];
if(rsstate) {
if(r8bstate) {
size_t inDone = 0;
[currentPtsLock lock];
size_t framesDone = rsstate_resample(rsstate, outputPtr, frameCount, &inDone, &rsTempBuffer[0], amountToRead);
size_t framesDone = r8bstate_resample(r8bstate, outputPtr, frameCount, &inDone, &outputBuffer[0], 2048);
[currentPtsLock unlock];
if(!framesDone) return 0;
frameCount = (int)framesDone;
outputPtr = &rsTempBuffer[0];
outputPtr = &outputBuffer[0];
chunkDuration = frameCount / newFormat.mSampleRate;
}
@ -161,38 +151,39 @@ static OSStatus eqRenderCallback(void *inRefCon, AudioUnitRenderActionFlags *ioA
[visController postSampleRate:44100.0];
float visTemp[8192];
if(newFormat.mSampleRate != 44100.0) {
if(newFormat.mSampleRate != lastVisRate) {
if(rsvis) {
if(r8bvis) {
for(;;) {
int samplesFlushed;
[currentPtsLock lock];
samplesFlushed = (int)rsstate_flush(rsvis, &visTemp[0], 8192);
samplesFlushed = (int)r8bstate_flush(r8bvis, &visTemp[0], 8192);
[currentPtsLock unlock];
if(samplesFlushed > 1) {
if(samplesFlushed) {
[visController postVisPCM:visTemp amount:samplesFlushed];
} else {
break;
}
}
[currentPtsLock lock];
rsstate_delete(rsvis);
rsvis = NULL;
r8bstate_delete(r8bvis);
r8bvis = NULL;
[currentPtsLock unlock];
}
lastVisRate = newFormat.mSampleRate;
[currentPtsLock lock];
rsvis = rsstate_new(1, lastVisRate, 44100.0);
r8bvis = r8bstate_new(1, 1024, lastVisRate, 44100.0);
[currentPtsLock unlock];
}
if(rsvis) {
if(r8bvis) {
int samplesProcessed;
size_t totalDone = 0;
size_t inDone = 0;
size_t visFrameCount = frameCount;
{
[currentPtsLock lock];
samplesProcessed = (int)rsstate_resample(rsvis, &visAudio[totalDone], visFrameCount, &inDone, &visTemp[0], 8192);
samplesProcessed = (int)r8bstate_resample(r8bvis, &visAudio[totalDone], visFrameCount, &inDone, &visTemp[0], 8192);
[currentPtsLock unlock];
if(samplesProcessed) {
[visController postVisPCM:&visTemp[0] amount:samplesProcessed];
@ -201,28 +192,28 @@ static OSStatus eqRenderCallback(void *inRefCon, AudioUnitRenderActionFlags *ioA
visFrameCount -= inDone;
} while(samplesProcessed && visFrameCount);
}
} else if(rsvis) {
} else if(r8bvis) {
for(;;) {
int samplesFlushed;
[currentPtsLock lock];
samplesFlushed = (int)rsstate_flush(rsvis, &visTemp[0], 8192);
samplesFlushed = (int)r8bstate_flush(r8bvis, &visTemp[0], 8192);
[currentPtsLock unlock];
if(samplesFlushed > 1) {
if(samplesFlushed) {
[visController postVisPCM:visTemp amount:samplesFlushed];
} else {
break;
}
}
[currentPtsLock lock];
rsstate_delete(rsvis);
rsvis = NULL;
r8bstate_delete(r8bvis);
r8bvis = NULL;
[currentPtsLock unlock];
[visController postVisPCM:&visAudio[0] amount:frameCount];
} else {
[visController postVisPCM:&visAudio[0] amount:frameCount];
}
cblas_scopy((int)(frameCount * newFormat.mChannelsPerFrame), outputPtr, 1, &buffer[0], 1);
cblas_scopy((int)(frameCount * newFormat.mChannelsPerFrame), outputPtr, 1, &inputBuffer[0], 1);
amountRead = frameCount;
} else {
return 0;
@ -246,7 +237,7 @@ static OSStatus eqRenderCallback(void *inRefCon, AudioUnitRenderActionFlags *ioA
volumeScale *= eqPreamp;
}
scale_by_volume(&buffer[0], amountRead * newFormat.mChannelsPerFrame, volumeScale);
scale_by_volume(&inputBuffer[0], amountRead * newFormat.mChannelsPerFrame, volumeScale);
return amountRead;
}
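
The visualization path in renderInput above always posts mono PCM at 44.1 kHz: a private resampler is rebuilt whenever the stream rate changes and dropped again once the stream is already at 44.1 kHz. A compressed sketch of just that bookkeeping, with the actual r8bstate resampler replaced by a stand-in type so the snippet is self-contained (the real code also flushes the old state to the visualizer before deleting it):

#include <stdio.h>
#include <stdlib.h>

/* Stand-in for the real resampler; only the lifecycle matters here. */
typedef struct { double srcRate, dstRate; } VisResampler;

static VisResampler *visResamplerNew(double srcRate, double dstRate) {
	VisResampler *r = malloc(sizeof *r);
	r->srcRate = srcRate;
	r->dstRate = dstRate;
	return r;
}

static void visResamplerDelete(VisResampler *r) { free(r); }

/* Keep one mono resampler into 44.1 kHz; rebuild it when the stream rate changes,
 * drop it entirely when the stream is already at 44.1 kHz. */
static void trackVisRate(VisResampler **vis, double *lastVisRate, double streamRate) {
	if(streamRate != 44100.0) {
		if(streamRate != *lastVisRate) {
			if(*vis) visResamplerDelete(*vis); /* real code flushes leftovers first */
			*lastVisRate = streamRate;
			*vis = visResamplerNew(streamRate, 44100.0);
		}
	} else if(*vis) {
		visResamplerDelete(*vis); /* back at 44.1 kHz: feed PCM straight through */
		*vis = NULL;
	}
}

int main(void) {
	VisResampler *vis = NULL;
	double lastVisRate = 44100.0;
	const double rates[] = { 44100.0, 96000.0, 96000.0, 192000.0, 44100.0 };
	for(size_t i = 0; i < sizeof(rates) / sizeof(rates[0]); ++i) {
		trackVisRate(&vis, &lastVisRate, rates[i]);
		printf("stream %6.0f Hz -> vis resampler %s\n", rates[i], vis ? "active" : "bypassed");
	}
	if(vis) visResamplerDelete(vis);
	return 0;
}
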
@ -263,8 +254,7 @@ static OSStatus eqRenderCallback(void *inRefCon, AudioUnitRenderActionFlags *ioA
currentPtsLock = [[NSLock alloc] init];
#ifdef OUTPUT_LOG
NSString *logName = [NSTemporaryDirectory() stringByAppendingPathComponent:@"CogAudioLog.raw"];
_logFile = fopen([logName UTF8String], "wb");
_logFile = fopen("/tmp/CogAudioLog.raw", "wb");
#endif
}
@ -311,14 +301,8 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
} else if([keyPath isEqualToString:@"values.eqPreamp"]) {
float preamp = [[[NSUserDefaultsController sharedUserDefaultsController] defaults] floatForKey:@"eqPreamp"];
eqPreamp = pow(10.0, preamp / 20.0);
} else if([keyPath isEqualToString:@"values.enableHrtf"]) {
enableHrtf = [[[NSUserDefaultsController sharedUserDefaultsController] defaults] boolForKey:@"enableHrtf"];
if(streamFormatStarted)
resetStreamFormat = YES;
} else if([keyPath isEqualToString:@"values.enableFSurround"]) {
enableFSurround = [[[NSUserDefaultsController sharedUserDefaultsController] defaults] boolForKey:@"enableFSurround"];
if(streamFormatStarted)
resetStreamFormat = YES;
} else if([keyPath isEqualToString:@"values.dontRemix"]) {
dontRemix = [[[NSUserDefaultsController sharedUserDefaultsController] defaults] boolForKey:@"dontRemix"];
}
}
@ -346,7 +330,6 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
- (void)threadEntry:(id)arg {
running = YES;
started = NO;
shouldPlayOutBuffer = NO;
secondsLatency = 1.0;
while(!stopping) {
@ -377,9 +360,7 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
if(bufferRef) {
CMTime chunkDuration = CMSampleBufferGetDuration(bufferRef);
[currentPtsLock lock];
outputPts = CMTimeAdd(outputPts, chunkDuration);
[currentPtsLock unlock];
trackPts = CMTimeAdd(trackPts, chunkDuration);
[audioRenderer enqueueSampleBuffer:bufferRef];
@ -617,70 +598,41 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
- (void)updateStreamFormat {
/* Set the channel layout for the audio queue */
resetStreamFormat = NO;
uint32_t channels = realStreamFormat.mChannelsPerFrame;
uint32_t channelConfig = realStreamChannelConfig;
if(enableFSurround && channels == 2 && channelConfig == AudioConfigStereo) {
[currentPtsLock lock];
fsurround = [[FSurroundFilter alloc] initWithSampleRate:realStreamFormat.mSampleRate];
[currentPtsLock unlock];
channels = [fsurround channelCount];
channelConfig = [fsurround channelConfig];
FSurroundDelayRemoved = NO;
} else {
[currentPtsLock lock];
fsurround = nil;
[currentPtsLock unlock];
}
/* Apple's audio processor really only supports common 1-8 channel formats */
if(enableHrtf || channels > 8 || ((channelConfig & ~(AudioConfig6Point1|AudioConfig7Point1)) != 0)) {
NSURL *presetUrl = [[NSBundle mainBundle] URLForResource:@"SADIE_D02-96000" withExtension:@"mhr"];
hrtf = [[HeadphoneFilter alloc] initWithImpulseFile:presetUrl forSampleRate:realStreamFormat.mSampleRate withInputChannels:channels withConfig:channelConfig];
channels = 2;
channelConfig = AudioChannelSideLeft | AudioChannelSideRight;
} else {
hrtf = nil;
}
streamFormat = realStreamFormat;
streamFormat.mChannelsPerFrame = channels;
streamFormat.mBytesPerFrame = sizeof(float) * channels;
streamFormat.mFramesPerPacket = 1;
streamFormat.mBytesPerPacket = sizeof(float) * channels;
streamChannelConfig = channelConfig;
AudioChannelLayoutTag tag = 0;
AudioChannelLayout layout = { 0 };
switch(streamChannelConfig) {
case AudioConfigMono:
tag = kAudioChannelLayoutTag_Mono;
deviceChannelConfig = AudioConfigMono;
break;
case AudioConfigStereo:
tag = kAudioChannelLayoutTag_Stereo;
deviceChannelConfig = AudioConfigStereo;
break;
case AudioConfig3Point0:
tag = kAudioChannelLayoutTag_WAVE_3_0;
deviceChannelConfig = AudioConfig3Point0;
break;
case AudioConfig4Point0:
tag = kAudioChannelLayoutTag_WAVE_4_0_A;
deviceChannelConfig = AudioConfig4Point0;
break;
case AudioConfig5Point0:
tag = kAudioChannelLayoutTag_WAVE_5_0_A;
deviceChannelConfig = AudioConfig5Point0;
break;
case AudioConfig5Point1:
tag = kAudioChannelLayoutTag_WAVE_5_1_A;
deviceChannelConfig = AudioConfig5Point1;
break;
case AudioConfig6Point1:
tag = kAudioChannelLayoutTag_WAVE_6_1;
deviceChannelConfig = AudioConfig6Point1;
break;
case AudioConfig7Point1:
tag = kAudioChannelLayoutTag_WAVE_7_1;
deviceChannelConfig = AudioConfig7Point1;
break;
default:
@ -688,6 +640,8 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
break;
}
streamTag = tag;
if(tag) {
layout.mChannelLayoutTag = tag;
} else {
@ -704,20 +658,11 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
return;
}
if(eqInitialized) {
AudioUnitUninitialize(_eq);
eqInitialized = NO;
}
AudioStreamBasicDescription asbd = streamFormat;
// Of course, non-interleaved has only one sample per frame/packet, per buffer
asbd.mFormatFlags |= kAudioFormatFlagIsNonInterleaved;
asbd.mBytesPerFrame = sizeof(float);
asbd.mBytesPerPacket = sizeof(float);
asbd.mFramesPerPacket = 1;
asbd.mFormatFlags &= ~kAudioFormatFlagIsPacked;
UInt32 maximumFrames = 4096;
UInt32 maximumFrames = 1024;
AudioUnitSetProperty(_eq, kAudioUnitProperty_MaximumFramesPerSlice, kAudioUnitScope_Global, 0, &maximumFrames, sizeof(maximumFrames));
AudioUnitSetProperty(_eq, kAudioUnitProperty_StreamFormat,
@ -730,12 +675,6 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
AudioUnitReset(_eq, kAudioUnitScope_Global, 0);
if(AudioUnitInitialize(_eq) != noErr) {
eqEnabled = NO;
return;
}
eqInitialized = YES;
eqEnabled = [[[[NSUserDefaultsController sharedUserDefaultsController] defaults] objectForKey:@"GraphicEQenable"] boolValue];
}
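
The two hunks above configure the graphic EQ unit: a non-interleaved float stream format (one sample per frame/packet, one buffer per channel), a fixed kAudioUnitProperty_MaximumFramesPerSlice, then reset and initialize. A standalone C sketch of the same property setup against Apple's GraphicEQ effect (the sample rate, channel count and error handling here are assumed; Cog instantiates its _eq unit elsewhere):

#include <AudioToolbox/AudioToolbox.h>
#include <stdio.h>

int main(void) {
	/* Find and instantiate Apple's GraphicEQ effect unit. */
	AudioComponentDescription desc = {
		.componentType = kAudioUnitType_Effect,
		.componentSubType = kAudioUnitSubType_GraphicEQ,
		.componentManufacturer = kAudioUnitManufacturer_Apple,
	};
	AudioComponent comp = AudioComponentFindNext(NULL, &desc);
	AudioComponentInstance eq = NULL;
	if(!comp || AudioComponentInstanceNew(comp, &eq) != noErr) return 1;

	/* Cap render slices at 1024 frames, one of the two values in the hunk above. */
	UInt32 maxFrames = 1024;
	AudioUnitSetProperty(eq, kAudioUnitProperty_MaximumFramesPerSlice,
	                     kAudioUnitScope_Global, 0, &maxFrames, sizeof(maxFrames));

	/* Non-interleaved 32-bit float: one sample per frame/packet, per channel buffer. */
	AudioStreamBasicDescription asbd = {
		.mSampleRate = 44100.0, /* assumed */
		.mFormatID = kAudioFormatLinearPCM,
		.mFormatFlags = kAudioFormatFlagIsFloat | kAudioFormatFlagIsNonInterleaved,
		.mChannelsPerFrame = 2, /* assumed */
		.mBitsPerChannel = 32,
		.mFramesPerPacket = 1,
		.mBytesPerFrame = sizeof(float),
		.mBytesPerPacket = sizeof(float),
	};
	AudioUnitSetProperty(eq, kAudioUnitProperty_StreamFormat,
	                     kAudioUnitScope_Input, 0, &asbd, sizeof(asbd));
	AudioUnitSetProperty(eq, kAudioUnitProperty_StreamFormat,
	                     kAudioUnitScope_Output, 0, &asbd, sizeof(asbd));

	AudioUnitReset(eq, kAudioUnitScope_Global, 0);
	OSStatus ok = AudioUnitInitialize(eq);
	printf("GraphicEQ initialize: %d\n", (int)ok);
	AudioComponentInstanceDispose(eq);
	return ok == noErr ? 0 : 1;
}
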
@ -748,10 +687,11 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
CMSampleBufferRef sampleBuffer = nil;
OSStatus err = CMAudioSampleBufferCreateReadyWithPacketDescriptions(kCFAllocatorDefault, blockBuffer, audioFormatDescription, samplesRendered, outputPts, nil, &sampleBuffer);
CFRelease(blockBuffer);
if(err != noErr) {
CFRelease(blockBuffer);
return nil;
}
CFRelease(blockBuffer);
return sampleBuffer;
}
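
The hunk above creates a ready CMSampleBuffer around an existing CMBlockBuffer of rendered PCM for AVSampleBufferAudioRenderer, and adjusts where that block buffer reference is released. A standalone C sketch of the full wrapping, releasing the reference once after the sample buffer has taken its own (the format, frame count and helper name are assumed):

#include <CoreMedia/CoreMedia.h>
#include <stdio.h>
#include <string.h>

/* Wrap interleaved float PCM in a ready CMSampleBuffer. Returns NULL on failure. */
static CMSampleBufferRef wrapPCM(const float *samples, CMItemCount frames,
                                 const AudioStreamBasicDescription *asbd, CMTime pts) {
	CMAudioFormatDescriptionRef fmt = NULL;
	if(CMAudioFormatDescriptionCreate(kCFAllocatorDefault, asbd, 0, NULL, 0, NULL, NULL, &fmt) != noErr)
		return NULL;

	size_t byteSize = (size_t)frames * asbd->mBytesPerFrame;
	CMBlockBufferRef block = NULL;
	OSStatus err = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, NULL, byteSize,
	                                                  kCFAllocatorDefault, NULL, 0, byteSize,
	                                                  kCMBlockBufferAssureMemoryNowFlag, &block);
	if(err != noErr || !block) { CFRelease(fmt); return NULL; }
	CMBlockBufferReplaceDataBytes(samples, block, 0, byteSize); /* copy the PCM in */

	CMSampleBufferRef sampleBuffer = NULL;
	err = CMAudioSampleBufferCreateReadyWithPacketDescriptions(kCFAllocatorDefault, block, fmt,
	                                                           frames, pts, NULL, &sampleBuffer);
	CFRelease(block); /* released once; the sample buffer holds its own reference */
	CFRelease(fmt);
	return err == noErr ? sampleBuffer : NULL;
}

int main(void) {
	AudioStreamBasicDescription asbd = {
		.mSampleRate = 44100.0,
		.mFormatID = kAudioFormatLinearPCM,
		.mFormatFlags = kAudioFormatFlagsNativeFloatPacked,
		.mChannelsPerFrame = 2,
		.mBitsPerChannel = 32,
		.mFramesPerPacket = 1,
		.mBytesPerFrame = 2 * sizeof(float),
		.mBytesPerPacket = 2 * sizeof(float),
	};
	float silence[1024 * 2];
	memset(silence, 0, sizeof(silence));
	CMSampleBufferRef buf = wrapPCM(silence, 1024, &asbd, CMTimeMake(0, 44100));
	printf("sample buffer: %s\n", buf ? "created" : "failed");
	if(buf) CFRelease(buf);
	return 0;
}
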
@ -763,101 +703,47 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
status = CMBlockBufferCreateEmpty(kCFAllocatorDefault, 0, 0, &blockListBuffer);
if(status != noErr || !blockListBuffer) return 0;
if(resetStreamFormat || streamFormatChanged) {
streamFormatChanged = NO;
[self updateStreamFormat];
}
int inputRendered = inputBufferLastTime;
int bytesRendered = inputRendered * newFormat.mBytesPerPacket;
while(inputRendered < 4096) {
int maxToRender = MIN(4096 - inputRendered, 512);
int rendered = [self renderInput:maxToRender toBuffer:&tempBuffer[0]];
if(rendered > 0) {
memcpy((((uint8_t *)inputBuffer) + bytesRendered), &tempBuffer[0], rendered * newFormat.mBytesPerPacket);
}
inputRendered += rendered;
bytesRendered += rendered * newFormat.mBytesPerPacket;
if(streamFormatChanged) {
streamFormatChanged = NO;
if(inputRendered) {
resetStreamFormat = YES;
break;
} else {
[self updateStreamFormat];
}
}
int inputRendered;
do {
inputRendered = [self renderInput];
if([self processEndOfStream]) break;
}
} while(!inputRendered);
inputBufferLastTime = inputRendered;
float tempBuffer[2048 * 32];
int samplesRenderedTotal = 0;
for(size_t i = 0; i < 2;) {
float *samplePtr;
int samplesRendered;
if(i == 0) {
if(!rsold) {
if(!r8bold) {
++i;
continue;
}
[currentPtsLock lock];
samplesRendered = rsstate_flush(rsold, &rsTempBuffer[0], 4096);
samplesRendered = r8bstate_flush(r8bold, &tempBuffer[0], 2048);
[currentPtsLock unlock];
if(samplesRendered < 4096) {
rsstate_delete(rsold);
rsold = NULL;
rsDone = YES;
++i;
continue;
if(!samplesRendered) {
r8bstate_delete(r8bold);
r8bold = NULL;
r8bDone = YES;
}
samplePtr = &rsTempBuffer[0];
samplePtr = &tempBuffer[0];
} else {
samplesRendered = inputRendered;
samplePtr = &inputBuffer[0];
if(rsDone) {
rsDone = NO;
realStreamFormat = newFormat;
realStreamChannelConfig = newChannelConfig;
if(r8bDone) {
r8bDone = NO;
streamFormat = newFormat;
streamChannelConfig = newChannelConfig;
[self updateStreamFormat];
}
}
if(samplesRendered || fsurround) {
if(fsurround) {
int countToProcess = samplesRendered;
if(countToProcess < 4096) {
bzero(samplePtr + countToProcess * 2, (4096 - countToProcess) * 2 * sizeof(float));
countToProcess = 4096;
}
[fsurround process:samplePtr output:&fsurroundBuffer[0] count:countToProcess];
samplePtr = &fsurroundBuffer[0];
if(resetStreamFormat || samplesRendered < 4096) {
bzero(&fsurroundBuffer[4096 * 6], 4096 * 2 * sizeof(float));
[fsurround process:&fsurroundBuffer[4096 * 6] output:&fsurroundBuffer[4096 * 6] count:4096];
samplesRendered += 2048;
}
if(!FSurroundDelayRemoved) {
FSurroundDelayRemoved = YES;
if(samplesRendered > 2048) {
samplePtr += 2048 * 6;
samplesRendered -= 2048;
}
}
}
if(!samplesRendered) {
break;
}
if(hrtf) {
[hrtf process:samplePtr sampleCount:samplesRendered toBuffer:&hrtfBuffer[0]];
samplePtr = &hrtfBuffer[0];
}
if(eqEnabled && eqInitialized) {
if(samplesRendered) {
if(eqEnabled) {
const int channels = streamFormat.mChannelsPerFrame;
if(channels > 0) {
const size_t channelsminusone = channels - 1;
@ -866,8 +752,8 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
ioData->mNumberBuffers = channels;
for(size_t i = 0; i < channels; ++i) {
ioData->mBuffers[i].mData = &eqBuffer[4096 * i];
ioData->mBuffers[i].mDataByteSize = samplesRendered * sizeof(float);
ioData->mBuffers[i].mData = &eqBuffer[1024 * i];
ioData->mBuffers[i].mDataByteSize = 1024 * sizeof(float);
ioData->mBuffers[i].mNumberChannels = 1;
}
@ -881,7 +767,7 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
timeStamp.mSampleTime += ((double)samplesRendered) / streamFormat.mSampleRate;
for(int i = 0; i < channels; ++i) {
cblas_scopy(samplesRendered, &eqBuffer[4096 * i], 1, samplePtr + i, channels);
cblas_scopy(samplesRendered, &eqBuffer[1024 * i], 1, samplePtr, channels);
}
}
}
@ -889,10 +775,6 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
CMBlockBufferRef blockBuffer = nil;
size_t dataByteSize = samplesRendered * sizeof(float) * streamFormat.mChannelsPerFrame;
#ifdef OUTPUT_LOG
fwrite(samplePtr, 1, dataByteSize, _logFile);
#endif
status = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, nil, dataByteSize, kCFAllocatorDefault, nil, 0, dataByteSize, kCMBlockBufferAssureMemoryNowFlag, &blockBuffer);
if(status != noErr || !blockBuffer) {
@ -920,15 +802,20 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
}
if(i == 0) {
samplesRenderedTotal += samplesRendered;
if(samplesRendered) {
break;
if(!samplesRendered) {
*blockBufferOut = blockListBuffer;
return samplesRenderedTotal + samplesRendered;
}
} else {
samplesRenderedTotal += samplesRendered;
if(!samplesRendered || samplesRenderedTotal >= 1024) {
++i;
} else {
inputBufferLastTime = 0;
samplesRenderedTotal += samplesRendered;
++i;
do {
inputRendered = [self renderInput];
if([self processEndOfStream]) break;
} while(!inputRendered);
}
}
}
@ -945,15 +832,9 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
stopInvoked = NO;
stopCompleted = NO;
commandStop = NO;
shouldPlayOutBuffer = NO;
audioFormatDescription = NULL;
resetStreamFormat = NO;
streamFormatChanged = NO;
inputBufferLastTime = 0;
running = NO;
stopping = NO;
stopped = NO;
@ -963,13 +844,11 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
downmixerForVis = nil;
rsDone = NO;
rsstate = NULL;
rsold = NULL;
r8bDone = NO;
r8bstate = NULL;
r8bold = NULL;
lastClippedSampleRate = 0.0;
rsvis = NULL;
r8bvis = NULL;
lastVisRate = 44100.0;
AudioComponentDescription desc;
@ -1055,8 +934,7 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.outputDevice" options:0 context:kOutputAVFoundationContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.GraphicEQenable" options:0 context:kOutputAVFoundationContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.eqPreamp" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kOutputAVFoundationContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.enableHrtf" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kOutputAVFoundationContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.enableFSurround" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kOutputAVFoundationContext];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.dontRemix" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew) context:kOutputAVFoundationContext];
observersapplied = YES;
[renderSynchronizer addRenderer:audioRenderer];
@ -1085,10 +963,9 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
OutputNode *outputController = self->outputController;
double *latencySecondsOut = &self->secondsLatency;
VisualizationController *visController = self->visController;
void **rsstate = &self->rsstate;
void **rsold = &self->rsold;
void **rsvis = &self->rsvis;
FSurroundFilter *const *fsurroundtest = &self->fsurround;
void **r8bstate = &self->r8bstate;
void **r8bold = &self->r8bold;
void **r8bvis = &self->r8bvis;
currentPtsObserver = [renderSynchronizer addPeriodicTimeObserverForInterval:interval
queue:NULL
usingBlock:^(CMTime time) {
@ -1101,23 +978,19 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
if(timeAdded > 0) {
[outputController incrementAmountPlayed:timeAdded];
}
[lock lock];
CMTime latencyTime = CMTimeSubtract(*outputPts, time);
[lock unlock];
double latencySeconds = CMTimeGetSeconds(latencyTime);
double latencyVis = 0.0;
[lock lock];
if(*rsstate) {
latencySeconds += rsstate_latency(*rsstate);
if(*r8bstate) {
latencySeconds += r8bstate_latency(*r8bstate);
}
if(*rsold) {
latencySeconds += rsstate_latency(*rsold);
if(*r8bold) {
latencySeconds += r8bstate_latency(*r8bold);
}
if(*rsvis) {
latencyVis = rsstate_latency(*rsvis);
}
if(*fsurroundtest) {
latencyVis += 2048.0 / [(*fsurroundtest) srate];
if(*r8bvis) {
latencyVis = r8bstate_latency(*r8bvis);
}
[lock unlock];
if(latencySeconds < 0)
@ -1136,10 +1009,7 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
}
- (void)removeSynchronizerBlock {
if(renderSynchronizer && currentPtsObserver) {
[renderSynchronizer removeTimeObserver:currentPtsObserver];
currentPtsObserver = nil;
}
}
- (void)setVolume:(double)v {
@ -1185,8 +1055,7 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.outputDevice" context:kOutputAVFoundationContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.GraphicEQenable" context:kOutputAVFoundationContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.eqPreamp" context:kOutputAVFoundationContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.enableHrtf" context:kOutputAVFoundationContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.enableFSurround" context:kOutputAVFoundationContext];
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.dontRemix" context:kOutputAVFoundationContext];
observersapplied = NO;
}
stopping = YES;
@ -1216,7 +1085,7 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
}
if(renderSynchronizer || audioRenderer) {
if(renderSynchronizer) {
if(shouldPlayOutBuffer && !commandStop) {
if(!commandStop) {
int compareVal = 0;
double secondsLatency = self->secondsLatency >= 0 ? self->secondsLatency : 0;
int compareMax = (((1000000 / 5000) * secondsLatency) + (10000 / 5000)); // latency plus 10ms, divide by sleep intervals
@ -1229,13 +1098,6 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
}
[self removeSynchronizerBlock];
[renderSynchronizer setRate:0];
if(audioRenderer) {
[renderSynchronizer removeRenderer:audioRenderer atTime:kCMTimeZero completionHandler:^(BOOL didRemoveRenderer) {
if(!didRemoveRenderer) {
DLog(@"Error removing renderer!");
}
}];
}
}
if(audioRenderer) {
[audioRenderer stopRequestingMediaData];
@ -1274,17 +1136,17 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
#endif
outputController = nil;
visController = nil;
if(rsstate) {
rsstate_delete(rsstate);
rsstate = NULL;
if(r8bstate) {
r8bstate_delete(r8bstate);
r8bstate = NULL;
}
if(rsold) {
rsstate_delete(rsold);
rsold = NULL;
if(r8bold) {
r8bstate_delete(r8bold);
r8bold = NULL;
}
if(rsvis) {
rsstate_delete(rsvis);
rsvis = NULL;
if(r8bvis) {
r8bstate_delete(r8bvis);
r8bvis = NULL;
}
stopCompleted = YES;
}
@ -1310,8 +1172,4 @@ current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, cons
secondsHdcdSustained = 10.0;
}
- (void)setShouldPlayOutBuffer:(BOOL)s {
shouldPlayOutBuffer = s;
}
@end


@ -1,132 +0,0 @@
//
// OutputCoreAudio.h
// Cog
//
// Created by Christopher Snowhill on 7/25/23.
// Copyright 2023-2024 Christopher Snowhill. All rights reserved.
//
#import <AssertMacros.h>
#import <Cocoa/Cocoa.h>
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import <AudioUnit/AudioUnit.h>
#import <CoreAudio/AudioHardware.h>
#import <CoreAudio/CoreAudioTypes.h>
#ifdef __cplusplus
#import <atomic>
using std::atomic_long;
#else
#import <stdatomic.h>
#endif
#import <simd/simd.h>
#import <CogAudio/ChunkList.h>
#import <CogAudio/HeadphoneFilter.h>
//#define OUTPUT_LOG
@class OutputNode;
@class AudioChunk;
@interface OutputCoreAudio : NSObject {
OutputNode *outputController;
dispatch_semaphore_t writeSemaphore;
dispatch_semaphore_t readSemaphore;
NSLock *outputLock;
double streamTimestamp;
BOOL stopInvoked;
BOOL stopCompleted;
BOOL running;
BOOL stopping;
BOOL stopped;
BOOL started;
BOOL paused;
BOOL restarted;
BOOL commandStop;
BOOL resetting;
BOOL cutOffInput;
BOOL fading, faded;
float fadeLevel;
float fadeStep;
float fadeTarget;
BOOL eqEnabled;
BOOL eqInitialized;
BOOL streamFormatStarted;
BOOL streamFormatChanged;
double secondsHdcdSustained;
BOOL defaultdevicelistenerapplied;
BOOL currentdevicelistenerapplied;
BOOL devicealivelistenerapplied;
BOOL observersapplied;
BOOL outputdevicechanged;
float volume;
float eqPreamp;
AVAudioFormat *_deviceFormat;
AudioDeviceID outputDeviceID;
AudioStreamBasicDescription deviceFormat;
AudioStreamBasicDescription realStreamFormat; // stream format pre-hrtf
AudioStreamBasicDescription streamFormat; // stream format last seen in render callback
uint32_t deviceChannelConfig;
uint32_t realStreamChannelConfig;
uint32_t streamChannelConfig;
AUAudioUnit *_au;
size_t _bufferSize;
BOOL resetStreamFormat;
BOOL shouldPlayOutBuffer;
ChunkList *outputBuffer;
#ifdef OUTPUT_LOG
NSFileHandle *_logFile;
#endif
}
- (id)initWithController:(OutputNode *)c;
- (BOOL)setup;
- (OSStatus)setOutputDeviceByID:(int)deviceID;
- (BOOL)setOutputDeviceWithDeviceDict:(NSDictionary *)deviceDict;
- (void)start;
- (void)pause;
- (void)resume;
- (void)stop;
- (void)fadeOut;
- (void)fadeOutBackground;
- (void)fadeIn;
- (double)latency;
- (double)volume;
- (void)setVolume:(double)v;
- (void)setShouldPlayOutBuffer:(BOOL)enabled;
- (void)sustainHDCD;
- (AudioStreamBasicDescription)deviceFormat;
- (uint32_t)deviceChannelConfig;
@end


@ -1,996 +0,0 @@
//
// OutputCoreAudio.m
// Cog
//
// Created by Christopher Snowhill on 7/25/23.
// Copyright 2023-2024 Christopher Snowhill. All rights reserved.
//
#import "OutputCoreAudio.h"
#import "OutputNode.h"
#ifdef _DEBUG
#import "BadSampleCleaner.h"
#endif
#import "Logging.h"
#import <Accelerate/Accelerate.h>
#import <CogAudio/VisualizationController.h>
#ifdef OUTPUT_LOG
#import <NSFileHandle+CreateFile.h>
#endif
extern void scale_by_volume(float *buffer, size_t count, float volume);
static NSString *CogPlaybackDidBeginNotificiation = @"CogPlaybackDidBeginNotificiation";
static BOOL fadeAudio(const float *inSamples, float *outSamples, size_t channels, size_t count, float *fadeLevel, float fadeStep, float fadeTarget) {
float _fadeLevel = *fadeLevel;
BOOL towardZero = fadeStep < 0.0;
BOOL stopping = NO;
size_t maxCount = (size_t)floor(fabs(fadeTarget - _fadeLevel) / fabs(fadeStep));
if(maxCount) {
size_t countToDo = MIN(count, maxCount);
for(size_t i = 0; i < channels; ++i) {
_fadeLevel = *fadeLevel;
vDSP_vrampmuladd(&inSamples[i], channels, &_fadeLevel, &fadeStep, &outSamples[i], channels, countToDo);
}
}
if(maxCount <= count) {
if(!towardZero && maxCount < count) {
vDSP_vadd(&inSamples[maxCount * channels], 1, &outSamples[maxCount * channels], 1, &outSamples[maxCount * channels], 1, (count - maxCount) * channels);
}
stopping = YES;
}
*fadeLevel = _fadeLevel;
return stopping;
}
@interface FadedBuffer : NSObject {
float fadeLevel;
float fadeStep;
float fadeTarget;
ChunkList *lastBuffer;
}
- (id)initWithBuffer:(ChunkList *)buffer fadeTarget:(float)fadeTarget sampleRate:(double)sampleRate;
- (BOOL)mix:(float *)outputBuffer sampleCount:(size_t)samples channelCount:(size_t)channels;
@end
@implementation FadedBuffer
- (id)initWithBuffer:(ChunkList *)buffer fadeTarget:(float)fadeTarget sampleRate:(double)sampleRate {
self = [super init];
if(self) {
fadeLevel = 1.0;
self->fadeTarget = fadeTarget;
lastBuffer = buffer;
const double maxFadeDurationMS = 1000.0 * [buffer listDuration];
const double fadeDuration = MIN(125.0f, maxFadeDurationMS);
fadeStep = ((fadeTarget - fadeLevel) / sampleRate) * (1000.0f / fadeDuration);
}
return self;
}
- (BOOL)mix:(float *)outputBuffer sampleCount:(size_t)samples channelCount:(size_t)channels {
if(lastBuffer) {
AudioChunk * chunk = [lastBuffer removeAndMergeSamples:samples callBlock:^BOOL{
// Always interrupt if buffer runs empty, because it is not being refilled any more
return true;
}];
if(chunk && [chunk frameCount]) {
// Will always be input request size or less
size_t samplesToMix = [chunk frameCount];
NSData *sampleData = [chunk removeSamples:samplesToMix];
return fadeAudio((const float *)[sampleData bytes], outputBuffer, channels, samplesToMix, &fadeLevel, fadeStep, fadeTarget);
}
}
// No buffer or no chunk, stream ended
return true;
}
@end
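
In numbers, with an assumed 44.1 kHz output and a full fade-out capped at the 125 ms used above: fadeStep = ((0 - 1) / 44100) * (1000 / 125) ≈ -0.000181406 per sample, and fadeAudio's maxCount = floor(1 / 0.000181406) = 5512 samples, so the ramp reaches silence after roughly 125 ms of output. The same arithmetic as a tiny C check:

#include <math.h>
#include <stdio.h>

int main(void) {
	/* Assumed example values: 44.1 kHz output, full fade-out, enough buffered audio. */
	const double sampleRate = 44100.0;
	const double fadeTarget = 0.0, fadeLevel = 1.0;
	const double fadeDuration = 125.0; /* ms, the cap used above */

	/* Same formula as -initWithBuffer:fadeTarget:sampleRate: above. */
	double fadeStep = ((fadeTarget - fadeLevel) / sampleRate) * (1000.0 / fadeDuration);
	/* Same bound fadeAudio() derives before ramping with vDSP_vrampmuladd. */
	double samplesToTarget = floor(fabs(fadeTarget - fadeLevel) / fabs(fadeStep));

	printf("fadeStep = %.9f per sample\n", fadeStep);               /* ~ -0.000181406 */
	printf("samples to reach target = %.0f (~%.1f ms)\n",
	       samplesToTarget, 1000.0 * samplesToTarget / sampleRate); /* 5512, ~125 ms */
	return 0;
}
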
@implementation OutputCoreAudio {
VisualizationController *visController;
NSLock *fadedBuffersLock;
NSMutableArray<FadedBuffer *> *fadedBuffers;
}
static void *kOutputCoreAudioContext = &kOutputCoreAudioContext;
- (AudioChunk *)renderInput:(int)amountToRead {
if(stopping == YES || [outputController shouldContinue] == NO) {
// Chain is dead, fill out the serial number pointer forever with silence
stopping = YES;
return [[AudioChunk alloc] init];
}
AudioStreamBasicDescription format;
uint32_t config;
if([outputController peekFormat:&format channelConfig:&config]) {
if(!streamFormatStarted || config != realStreamChannelConfig || memcmp(&realStreamFormat, &format, sizeof(format)) != 0) {
realStreamFormat = format;
realStreamChannelConfig = config;
streamFormatStarted = YES;
streamFormatChanged = YES;
}
}
if(streamFormatChanged) {
return [[AudioChunk alloc] init];
}
return [outputController readChunk:amountToRead];
}
- (id)initWithController:(OutputNode *)c {
self = [super init];
if(self) {
outputController = c;
volume = 1.0;
outputDeviceID = -1;
secondsHdcdSustained = 0;
outputLock = [[NSLock alloc] init];
fadedBuffersLock = [[NSLock alloc] init];
fadedBuffers = [[NSMutableArray alloc] init];
#ifdef OUTPUT_LOG
NSString *logName = [NSTemporaryDirectory() stringByAppendingPathComponent:@"CogAudioLog.raw"];
_logFile = [NSFileHandle fileHandleForWritingAtPath:logName createFile:YES];
#endif
}
return self;
}
static OSStatus
default_device_changed(AudioObjectID inObjectID, UInt32 inNumberAddresses, const AudioObjectPropertyAddress *inAddresses, void *inUserData) {
OutputCoreAudio *_self = (__bridge OutputCoreAudio *)inUserData;
return [_self setOutputDeviceByID:-1];
}
static OSStatus
current_device_listener(AudioObjectID inObjectID, UInt32 inNumberAddresses, const AudioObjectPropertyAddress *inAddresses, void *inUserData) {
OutputCoreAudio *_self = (__bridge OutputCoreAudio *)inUserData;
for(UInt32 i = 0; i < inNumberAddresses; ++i) {
switch(inAddresses[i].mSelector) {
case kAudioDevicePropertyDeviceIsAlive:
return [_self setOutputDeviceByID:-1];
case kAudioDevicePropertyNominalSampleRate:
case kAudioDevicePropertyStreamFormat:
_self->outputdevicechanged = YES;
return noErr;
}
}
return noErr;
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
if(context != kOutputCoreAudioContext) {
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
return;
}
if([keyPath isEqualToString:@"values.outputDevice"]) {
NSDictionary *device = [[[NSUserDefaultsController sharedUserDefaultsController] defaults] objectForKey:@"outputDevice"];
[self setOutputDeviceWithDeviceDict:device];
}
}
- (BOOL)signalEndOfStream:(double)latency {
stopped = YES;
BOOL ret = [outputController selectNextBuffer];
stopped = ret;
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(NSEC_PER_SEC * latency)), dispatch_get_main_queue(), ^{
[self->outputController endOfInputPlayed];
[self->outputController resetAmountPlayed];
});
return ret;
}
- (BOOL)processEndOfStream {
if(stopping || ([outputController endOfStream] == YES && [self signalEndOfStream:[outputController getTotalLatency]])) {
stopping = YES;
return YES;
}
return NO;
}
- (void)threadEntry:(id)arg {
@autoreleasepool {
NSThread *currentThread = [NSThread currentThread];
[currentThread setThreadPriority:0.75];
[currentThread setQualityOfService:NSQualityOfServiceUserInitiated];
}
running = YES;
started = NO;
shouldPlayOutBuffer = NO;
BOOL rendered = NO;
while(!stopping) {
@autoreleasepool {
if(outputdevicechanged) {
[self updateDeviceFormat];
outputdevicechanged = NO;
}
if([outputController shouldReset]) {
[outputController setShouldReset:NO];
[outputLock lock];
started = NO;
restarted = NO;
[outputBuffer reset];
[outputLock unlock];
}
if(stopping)
break;
if(!cutOffInput && ![outputBuffer isFull]) {
[self renderAndConvert];
rendered = YES;
} else {
rendered = NO;
}
if(!started && !paused) {
// Prevent this call from hanging when used in this thread, when buffer may be empty
// and waiting for this very thread to fill it
resetting = YES;
[self resume];
resetting = NO;
}
if([outputController shouldContinue] == NO) {
break;
}
}
if(!rendered) {
usleep(5000);
}
}
stopped = YES;
if(!stopInvoked) {
[self doStop];
}
}
- (OSStatus)setOutputDeviceByID:(int)deviceIDIn {
OSStatus err;
BOOL defaultDevice = NO;
AudioObjectPropertyAddress theAddress = {
.mSelector = kAudioHardwarePropertyDefaultOutputDevice,
.mScope = kAudioObjectPropertyScopeGlobal,
.mElement = kAudioObjectPropertyElementMaster
};
AudioDeviceID deviceID = (AudioDeviceID)deviceIDIn;
if(deviceIDIn == -1) {
defaultDevice = YES;
UInt32 size = sizeof(AudioDeviceID);
err = AudioObjectGetPropertyData(kAudioObjectSystemObject, &theAddress, 0, NULL, &size, &deviceID);
if(err != noErr) {
DLog(@"THERE'S NO DEFAULT OUTPUT DEVICE");
return err;
}
}
if(_au) {
if(defaultdevicelistenerapplied && !defaultDevice) {
/* Already set above
* theAddress.mSelector = kAudioHardwarePropertyDefaultOutputDevice; */
AudioObjectRemovePropertyListener(kAudioObjectSystemObject, &theAddress, default_device_changed, (__bridge void *_Nullable)(self));
defaultdevicelistenerapplied = NO;
}
outputdevicechanged = NO;
if(outputDeviceID != deviceID) {
if(currentdevicelistenerapplied) {
if(devicealivelistenerapplied) {
theAddress.mSelector = kAudioDevicePropertyDeviceIsAlive;
AudioObjectRemovePropertyListener(outputDeviceID, &theAddress, current_device_listener, (__bridge void *_Nullable)(self));
devicealivelistenerapplied = NO;
}
theAddress.mSelector = kAudioDevicePropertyStreamFormat;
AudioObjectRemovePropertyListener(outputDeviceID, &theAddress, current_device_listener, (__bridge void *_Nullable)(self));
theAddress.mSelector = kAudioDevicePropertyNominalSampleRate;
AudioObjectRemovePropertyListener(outputDeviceID, &theAddress, current_device_listener, (__bridge void *_Nullable)(self));
currentdevicelistenerapplied = NO;
}
DLog(@"Device: %i\n", deviceID);
outputDeviceID = deviceID;
NSError *nserr;
[_au setDeviceID:outputDeviceID error:&nserr];
if(nserr != nil) {
return (OSErr)[nserr code];
}
outputdevicechanged = YES;
}
if(!currentdevicelistenerapplied) {
if(!devicealivelistenerapplied && !defaultDevice) {
theAddress.mSelector = kAudioDevicePropertyDeviceIsAlive;
AudioObjectAddPropertyListener(outputDeviceID, &theAddress, current_device_listener, (__bridge void *_Nullable)(self));
devicealivelistenerapplied = YES;
}
theAddress.mSelector = kAudioDevicePropertyStreamFormat;
AudioObjectAddPropertyListener(outputDeviceID, &theAddress, current_device_listener, (__bridge void *_Nullable)(self));
theAddress.mSelector = kAudioDevicePropertyNominalSampleRate;
AudioObjectAddPropertyListener(outputDeviceID, &theAddress, current_device_listener, (__bridge void *_Nullable)(self));
currentdevicelistenerapplied = YES;
}
if(!defaultdevicelistenerapplied && defaultDevice) {
theAddress.mSelector = kAudioHardwarePropertyDefaultOutputDevice;
AudioObjectAddPropertyListener(kAudioObjectSystemObject, &theAddress, default_device_changed, (__bridge void *_Nullable)(self));
defaultdevicelistenerapplied = YES;
}
} else {
err = noErr;
}
if(err != noErr) {
DLog(@"No output device with ID %d could be found.", deviceID);
return err;
}
return err;
}
- (BOOL)setOutputDeviceWithDeviceDict:(NSDictionary *)deviceDict {
NSNumber *deviceIDNum = deviceDict ? [deviceDict objectForKey:@"deviceID"] : @(-1);
int outputDeviceID = deviceIDNum ? [deviceIDNum intValue] : -1;
__block OSStatus err = [self setOutputDeviceByID:outputDeviceID];
if(err != noErr) {
// Try matching by name.
NSString *userDeviceName = deviceDict[@"name"];
[self enumerateAudioOutputsUsingBlock:
^(NSString *deviceName, AudioDeviceID deviceID, AudioDeviceID systemDefaultID, BOOL *stop) {
if([deviceName isEqualToString:userDeviceName]) {
err = [self setOutputDeviceByID:deviceID];
#if 0
// Disable. Would cause loop by triggering `-observeValueForKeyPath:ofObject:change:context:` above.
// Update `outputDevice`, in case the ID has changed.
NSDictionary *deviceInfo = @{
@"name": deviceName,
@"deviceID": @(deviceID),
};
[[NSUserDefaults standardUserDefaults] setObject:deviceInfo forKey:@"outputDevice"];
#endif
DLog(@"Found output device: \"%@\" (%d).", deviceName, deviceID);
*stop = YES;
}
}];
}
if(err != noErr) {
ALog(@"No output device could be found, your random error code is %d. Have a nice day!", err);
return NO;
}
return YES;
}
// The following is largely a copy pasta of -awakeFromNib from "OutputsArrayController.m".
// TODO: Share the code. (How to do this across xcodeproj?)
- (void)enumerateAudioOutputsUsingBlock:(void(NS_NOESCAPE ^ _Nonnull)(NSString *deviceName, AudioDeviceID deviceID, AudioDeviceID systemDefaultID, BOOL *stop))block {
UInt32 propsize;
AudioObjectPropertyAddress theAddress = {
.mSelector = kAudioHardwarePropertyDevices,
.mScope = kAudioObjectPropertyScopeGlobal,
.mElement = kAudioObjectPropertyElementMaster
};
__Verify_noErr(AudioObjectGetPropertyDataSize(kAudioObjectSystemObject, &theAddress, 0, NULL, &propsize));
UInt32 nDevices = propsize / (UInt32)sizeof(AudioDeviceID);
AudioDeviceID *devids = (AudioDeviceID *)malloc(propsize);
__Verify_noErr(AudioObjectGetPropertyData(kAudioObjectSystemObject, &theAddress, 0, NULL, &propsize, devids));
theAddress.mSelector = kAudioHardwarePropertyDefaultOutputDevice;
AudioDeviceID systemDefault;
propsize = sizeof(systemDefault);
__Verify_noErr(AudioObjectGetPropertyData(kAudioObjectSystemObject, &theAddress, 0, NULL, &propsize, &systemDefault));
theAddress.mScope = kAudioDevicePropertyScopeOutput;
for(UInt32 i = 0; i < nDevices; ++i) {
UInt32 isAlive = 0;
propsize = sizeof(isAlive);
theAddress.mSelector = kAudioDevicePropertyDeviceIsAlive;
__Verify_noErr(AudioObjectGetPropertyData(devids[i], &theAddress, 0, NULL, &propsize, &isAlive));
if(!isAlive) continue;
CFStringRef name = NULL;
propsize = sizeof(name);
theAddress.mSelector = kAudioDevicePropertyDeviceNameCFString;
__Verify_noErr(AudioObjectGetPropertyData(devids[i], &theAddress, 0, NULL, &propsize, &name));
propsize = 0;
theAddress.mSelector = kAudioDevicePropertyStreamConfiguration;
__Verify_noErr(AudioObjectGetPropertyDataSize(devids[i], &theAddress, 0, NULL, &propsize));
if(propsize < sizeof(UInt32)) {
if(name) CFRelease(name);
continue;
}
AudioBufferList *bufferList = (AudioBufferList *)malloc(propsize);
__Verify_noErr(AudioObjectGetPropertyData(devids[i], &theAddress, 0, NULL, &propsize, bufferList));
UInt32 bufferCount = bufferList->mNumberBuffers;
free(bufferList);
if(!bufferCount) {
if(name) CFRelease(name);
continue;
}
BOOL stop = NO;
NSString *deviceName = name ? [NSString stringWithString:(__bridge NSString *)name] : [NSString stringWithFormat:@"Unknown device %u", (unsigned int)devids[i]];
block(deviceName,
devids[i],
systemDefault,
&stop);
CFRelease(name);
if(stop) {
break;
}
}
free(devids);
}
- (BOOL)updateDeviceFormat {
AVAudioFormat *format = _au.outputBusses[0].format;
if(!_deviceFormat || ![_deviceFormat isEqual:format]) {
NSError *err;
AVAudioFormat *renderFormat;
_deviceFormat = format;
deviceFormat = *(format.streamDescription);
/// Seems some 3rd party devices return incorrect stuff...or I just don't like noninterleaved data.
deviceFormat.mFormatFlags &= ~kLinearPCMFormatFlagIsNonInterleaved;
// deviceFormat.mFormatFlags &= ~kLinearPCMFormatFlagIsFloat;
// deviceFormat.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger;
// We don't want more than 8 channels
if(deviceFormat.mChannelsPerFrame > 8) {
deviceFormat.mChannelsPerFrame = 8;
}
deviceFormat.mBytesPerFrame = deviceFormat.mChannelsPerFrame * (deviceFormat.mBitsPerChannel / 8);
deviceFormat.mBytesPerPacket = deviceFormat.mBytesPerFrame * deviceFormat.mFramesPerPacket;
/* Set the channel layout for the audio queue */
AudioChannelLayoutTag tag = 0;
switch(deviceFormat.mChannelsPerFrame) {
case 1:
tag = kAudioChannelLayoutTag_Mono;
deviceChannelConfig = AudioConfigMono;
break;
case 2:
tag = kAudioChannelLayoutTag_Stereo;
deviceChannelConfig = AudioConfigStereo;
break;
case 3:
tag = kAudioChannelLayoutTag_DVD_4;
deviceChannelConfig = AudioConfig3Point0;
break;
case 4:
tag = kAudioChannelLayoutTag_Quadraphonic;
deviceChannelConfig = AudioConfig4Point0;
break;
case 5:
tag = kAudioChannelLayoutTag_MPEG_5_0_A;
deviceChannelConfig = AudioConfig5Point0;
break;
case 6:
tag = kAudioChannelLayoutTag_MPEG_5_1_A;
deviceChannelConfig = AudioConfig5Point1;
break;
case 7:
tag = kAudioChannelLayoutTag_MPEG_6_1_A;
deviceChannelConfig = AudioConfig6Point1;
break;
case 8:
tag = kAudioChannelLayoutTag_MPEG_7_1_A;
deviceChannelConfig = AudioConfig7Point1;
break;
}
renderFormat = [[AVAudioFormat alloc] initWithStreamDescription:&deviceFormat channelLayout:[[AVAudioChannelLayout alloc] initWithLayoutTag:tag]];
resetting = YES;
[_au stopHardware];
[_au.inputBusses[0] setFormat:renderFormat error:&err];
if(err != nil)
return NO;
[outputController setFormat:&deviceFormat channelConfig:deviceChannelConfig];
[outputLock lock];
[outputBuffer reset];
[outputLock unlock];
if(started) {
[_au startHardwareAndReturnError:&err];
if(err != nil)
return NO;
}
resetting = NO;
}
return YES;
}
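
A worked instance of the byte-size bookkeeping in -updateDeviceFormat above, for an assumed 32-bit float device clamped to the 8-channel maximum:

#include <stdio.h>

int main(void) {
	/* Assumed device: 32-bit float samples, clamped to 8 channels, 1 frame per packet. */
	unsigned channels = 8, bitsPerChannel = 32, framesPerPacket = 1;
	unsigned bytesPerFrame = channels * (bitsPerChannel / 8);  /* 8 * 4  = 32 bytes */
	unsigned bytesPerPacket = bytesPerFrame * framesPerPacket; /* 32 * 1 = 32 bytes */
	printf("mBytesPerFrame=%u mBytesPerPacket=%u\n", bytesPerFrame, bytesPerPacket);
	return 0;
}
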
- (void)updateStreamFormat {
/* Set the channel layout for the audio queue */
resetStreamFormat = NO;
uint32_t channels = realStreamFormat.mChannelsPerFrame;
uint32_t channelConfig = realStreamChannelConfig;
streamFormat = realStreamFormat;
streamFormat.mChannelsPerFrame = channels;
streamFormat.mBytesPerFrame = sizeof(float) * channels;
streamFormat.mFramesPerPacket = 1;
streamFormat.mBytesPerPacket = sizeof(float) * channels;
streamChannelConfig = channelConfig;
AudioChannelLayoutTag tag = 0;
AudioChannelLayout layout = { 0 };
switch(streamChannelConfig) {
case AudioConfigMono:
tag = kAudioChannelLayoutTag_Mono;
break;
case AudioConfigStereo:
tag = kAudioChannelLayoutTag_Stereo;
break;
case AudioConfig3Point0:
tag = kAudioChannelLayoutTag_WAVE_3_0;
break;
case AudioConfig4Point0:
tag = kAudioChannelLayoutTag_WAVE_4_0_A;
break;
case AudioConfig5Point0:
tag = kAudioChannelLayoutTag_WAVE_5_0_A;
break;
case AudioConfig5Point1:
tag = kAudioChannelLayoutTag_WAVE_5_1_A;
break;
case AudioConfig6Point1:
tag = kAudioChannelLayoutTag_WAVE_6_1;
break;
case AudioConfig7Point1:
tag = kAudioChannelLayoutTag_WAVE_7_1;
break;
default:
tag = 0;
break;
}
if(tag) {
layout.mChannelLayoutTag = tag;
} else {
layout.mChannelLayoutTag = kAudioChannelLayoutTag_UseChannelBitmap;
layout.mChannelBitmap = streamChannelConfig;
}
}
- (void)renderAndConvert {
if(resetStreamFormat) {
[self updateStreamFormat];
if([self processEndOfStream]) {
return;
}
}
AudioChunk *chunk = [self renderInput:512];
size_t frameCount = 0;
if(chunk && (frameCount = [chunk frameCount])) {
[outputLock lock];
[outputBuffer addChunk:chunk];
[outputLock unlock];
}
if(streamFormatChanged) {
streamFormatChanged = NO;
if(frameCount) {
resetStreamFormat = YES;
} else {
[self updateStreamFormat];
}
}
[self processEndOfStream];
}
- (void)audioOutputBlock {
__block AudioStreamBasicDescription *format = &deviceFormat;
__block void *refCon = (__bridge void *)self;
__block NSLock *refLock = self->outputLock;
__block NSLock *fadersLock = self->fadedBuffersLock;
__block NSMutableArray *faders = self->fadedBuffers;
#ifdef OUTPUT_LOG
__block NSFileHandle *logFile = _logFile;
#endif
_au.outputProvider = ^AUAudioUnitStatus(AudioUnitRenderActionFlags *_Nonnull actionFlags, const AudioTimeStamp *_Nonnull timestamp, AUAudioFrameCount frameCount, NSInteger inputBusNumber, AudioBufferList *_Nonnull inputData) {
if(!frameCount) return 0;
const int channels = format->mChannelsPerFrame;
if(!channels) return 0;
if(!inputData->mNumberBuffers || !inputData->mBuffers[0].mData) return 0;
OutputCoreAudio *_self = (__bridge OutputCoreAudio *)refCon;
int renderedSamples = 0;
inputData->mBuffers[0].mDataByteSize = frameCount * format->mBytesPerPacket;
bzero(inputData->mBuffers[0].mData, inputData->mBuffers[0].mDataByteSize);
inputData->mBuffers[0].mNumberChannels = channels;
if(_self->resetting) {
return 0;
}
float *outSamples = (float*)inputData->mBuffers[0].mData;
@autoreleasepool {
if(!_self->faded) {
while(renderedSamples < frameCount) {
[refLock lock];
AudioChunk *chunk = nil;
if(_self->outputBuffer && ![_self->outputBuffer isEmpty]) {
chunk = [_self->outputBuffer removeSamples:frameCount - renderedSamples];
}
[refLock unlock];
size_t _frameCount = 0;
if(chunk && [chunk frameCount]) {
_self->streamTimestamp = [chunk streamTimestamp];
_frameCount = [chunk frameCount];
NSData *sampleData = [chunk removeSamples:_frameCount];
float *samplePtr = (float *)[sampleData bytes];
size_t inputTodo = MIN(_frameCount, frameCount - renderedSamples);
if(!_self->fading) {
cblas_scopy((int)(inputTodo * channels), samplePtr, 1, outSamples + renderedSamples * channels, 1);
} else {
BOOL faded = fadeAudio(samplePtr, outSamples + renderedSamples * channels, channels, inputTodo, &_self->fadeLevel, _self->fadeStep, _self->fadeTarget);
if(faded) {
if(_self->fadeStep < 0.0) {
_self->faded = YES;
}
_self->fading = NO;
_self->fadeStep = 0.0f;
}
}
renderedSamples += inputTodo;
}
if(_self->stopping || _self->resetting || _self->faded || !chunk || !_frameCount) {
break;
}
}
}
double secondsRendered = (double)renderedSamples / format->mSampleRate;
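// Mix in any buffers still fading out in the background; drop each one once its fade completes.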
[fadersLock lock];
for(size_t i = 0; i < [faders count];) {
FadedBuffer *buffer = faders[i];
BOOL stopping = [buffer mix:outSamples sampleCount:frameCount channelCount:channels];
if(stopping) {
[faders removeObjectAtIndex:i];
} else {
++i;
}
}
[fadersLock unlock];
scale_by_volume(outSamples, frameCount * channels, _self->volume);
[_self updateLatency:secondsRendered];
#ifdef OUTPUT_LOG
NSData *outData = [NSData dataWithBytes:outSamples length:frameCount * format->mBytesPerPacket];
[logFile writeData:outData];
#endif
}
#ifdef _DEBUG
[BadSampleCleaner cleanSamples:(float *)inputData->mBuffers[0].mData
amount:inputData->mBuffers[0].mDataByteSize / sizeof(float)
location:@"final output"];
#endif
return 0;
};
}
- (BOOL)setup {
if(_au)
[self stop];
@synchronized(self) {
stopInvoked = NO;
stopCompleted = NO;
commandStop = NO;
shouldPlayOutBuffer = NO;
resetStreamFormat = NO;
streamFormatChanged = NO;
streamFormatStarted = NO;
running = NO;
stopping = NO;
stopped = NO;
paused = NO;
outputDeviceID = -1;
restarted = NO;
cutOffInput = NO;
fadeTarget = 1.0f;
fadeLevel = 1.0f;
fadeStep = 0.0f;
fading = NO;
faded = NO;
AudioComponentDescription desc;
NSError *err;
desc.componentType = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_HALOutput;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
desc.componentFlags = 0;
desc.componentFlagsMask = 0;
_au = [[AUAudioUnit alloc] initWithComponentDescription:desc error:&err];
if(err != nil)
return NO;
// Setup the output device before mucking with settings
NSDictionary *device = [[[NSUserDefaultsController sharedUserDefaultsController] defaults] objectForKey:@"outputDevice"];
if(device) {
BOOL ok = [self setOutputDeviceWithDeviceDict:device];
if(!ok) {
// Ruh roh.
[self setOutputDeviceWithDeviceDict:nil];
[[[NSUserDefaultsController sharedUserDefaultsController] defaults] removeObjectForKey:@"outputDevice"];
}
} else {
[self setOutputDeviceWithDeviceDict:nil];
}
[self audioOutputBlock];
[_au allocateRenderResourcesAndReturnError:&err];
[self updateDeviceFormat];
visController = [VisualizationController sharedController];
[[NSUserDefaultsController sharedUserDefaultsController] addObserver:self forKeyPath:@"values.outputDevice" options:0 context:kOutputCoreAudioContext];
observersapplied = YES;
outputBuffer = [[ChunkList alloc] initWithMaximumDuration:0.5];
if(!outputBuffer) {
return NO;
}
return (err == nil);
}
}
- (void)updateLatency:(double)secondsPlayed {
double visLatency = [outputController getVisLatency];
if(secondsPlayed > 0) {
[outputController setAmountPlayed:streamTimestamp];
}
[visController postLatency:visLatency];
}
- (double)volume {
return volume * 100.0f;
}
- (void)setVolume:(double)v {
volume = v * 0.01f;
}
- (double)latency {
return [outputBuffer listDuration];
}
- (void)start {
[self threadEntry:nil];
}
- (void)stop {
commandStop = YES;
[self doStop];
}
- (void)doStop {
if(stopInvoked) {
return;
}
@synchronized(self) {
stopInvoked = YES;
if(observersapplied) {
[[NSUserDefaultsController sharedUserDefaultsController] removeObserver:self forKeyPath:@"values.outputDevice" context:kOutputCoreAudioContext];
observersapplied = NO;
}
stopping = YES;
paused = NO;
if(defaultdevicelistenerapplied || currentdevicelistenerapplied || devicealivelistenerapplied) {
AudioObjectPropertyAddress theAddress = {
.mScope = kAudioObjectPropertyScopeGlobal,
.mElement = kAudioObjectPropertyElementMaster
};
if(defaultdevicelistenerapplied) {
theAddress.mSelector = kAudioHardwarePropertyDefaultOutputDevice;
AudioObjectRemovePropertyListener(kAudioObjectSystemObject, &theAddress, default_device_changed, (__bridge void *_Nullable)(self));
defaultdevicelistenerapplied = NO;
}
if(devicealivelistenerapplied) {
theAddress.mSelector = kAudioDevicePropertyDeviceIsAlive;
AudioObjectRemovePropertyListener(outputDeviceID, &theAddress, current_device_listener, (__bridge void *_Nullable)(self));
devicealivelistenerapplied = NO;
}
if(currentdevicelistenerapplied) {
theAddress.mSelector = kAudioDevicePropertyStreamFormat;
AudioObjectRemovePropertyListener(outputDeviceID, &theAddress, current_device_listener, (__bridge void *_Nullable)(self));
theAddress.mSelector = kAudioDevicePropertyNominalSampleRate;
AudioObjectRemovePropertyListener(outputDeviceID, &theAddress, current_device_listener, (__bridge void *_Nullable)(self));
currentdevicelistenerapplied = NO;
}
}
if(_au) {
if(shouldPlayOutBuffer && !commandStop) {
double compareVal = 0;
double secondsLatency = [outputController getTotalLatency];
int compareMax = (((1000000 / 5000) * secondsLatency) + (10000 / 5000)); // latency plus 10ms, divide by sleep intervals
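// i.e. 200 polls per second of latency, plus 2 extra polls (~10 ms of slack), at 5 ms per usleep() below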
do {
compareVal = [outputController getTotalLatency];
usleep(5000);
} while(!commandStop && compareVal > 0 && compareMax-- > 0);
} else {
[self fadeOut];
[fadedBuffersLock lock];
while([fadedBuffers count]) {
[fadedBuffersLock unlock];
usleep(10000);
[fadedBuffersLock lock];
}
[fadedBuffersLock unlock];
}
[_au stopHardware];
_au = nil;
}
if(running) {
while(!stopped) {
stopping = YES;
usleep(5000);
}
}
#ifdef OUTPUT_LOG
if(_logFile) {
[_logFile closeFile];
_logFile = NULL;
}
#endif
outputController = nil;
if(visController) {
[visController reset];
visController = nil;
}
stopCompleted = YES;
}
}
- (void)dealloc {
[self stop];
// In case stop called on another thread first
while(!stopCompleted) {
usleep(500);
}
}
- (void)pause {
paused = YES;
if(started)
[_au stopHardware];
}
- (void)resume {
NSError *err;
[_au startHardwareAndReturnError:&err];
paused = NO;
started = YES;
}
- (void)sustainHDCD {
secondsHdcdSustained = 10.0;
}
- (void)setShouldPlayOutBuffer:(BOOL)s {
shouldPlayOutBuffer = s;
}
- (AudioStreamBasicDescription)deviceFormat {
return deviceFormat;
}
- (uint32_t)deviceChannelConfig {
return deviceChannelConfig;
}
// 125 milliseconds
- (void)fadeOut {
fadeTarget = 0.0f;
fadeStep = ((fadeTarget - fadeLevel) / deviceFormat.mSampleRate) * (1000.0f / 125.0f);
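// A full-scale fade (|fadeTarget - fadeLevel| == 1.0) completes in 125 ms;
// at 44100 Hz, for example, this works out to a step of -8/44100 ≈ -0.00018 per sample.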
fading = YES;
}
- (void)fadeOutBackground {
cutOffInput = YES;
[outputLock lock];
[fadedBuffersLock lock];
FadedBuffer *buffer = [[FadedBuffer alloc] initWithBuffer:outputBuffer fadeTarget:0.0 sampleRate:deviceFormat.mSampleRate];
outputBuffer = [[ChunkList alloc] initWithMaximumDuration:0.5];
[fadedBuffers addObject:buffer];
[fadedBuffersLock unlock];
[outputLock unlock];
}
- (void)fadeIn {
fadeLevel = 0.0f;
fadeTarget = 1.0f;
fadeStep = ((fadeTarget - fadeLevel) / deviceFormat.mSampleRate) * (1000.0f / 125.0f);
fading = YES;
faded = NO;
cutOffInput = NO;
}
@end


@@ -1,11 +1,5 @@
// Plugins! HOORAY!
#if __has_include(<CogAudio/AudioChunk.h>)
# import <CogAudio/AudioChunk.h>
#else
# import "AudioChunk.h"
#endif
@protocol CogSource <NSObject>
+ (NSArray *)schemes; // http, file, etc
@@ -48,7 +42,7 @@
- (NSDictionary *)properties;
- (NSDictionary *)metadata; // Only to be implemented for dynamic metadata, send events on change
- (AudioChunk *)readAudio;
- (int)readAudio:(void *)buffer frames:(UInt32)frames;
- (BOOL)open:(id<CogSource>)source;
- (long)seek:(long)frame;
@@ -110,10 +104,9 @@
- (int)putMetadataInURL:(NSURL *)url;
@end
#ifdef __cplusplus
extern "C" {
#endif
extern NSString *guess_encoding_of_string(const char *input);
#ifdef __cplusplus
static NSString *guess_encoding_of_string(const char *input) {
NSString *ret = @"";
NSData *stringData = [NSData dataWithBytes:input length:strlen(input)];
[NSString stringEncodingForData:stringData encodingOptions:nil convertedString:&ret usedLossyConversion:nil];
return ret;
}
#endif


@@ -2,7 +2,7 @@
#import <Cocoa/Cocoa.h>
#import <CogAudio/Plugin.h>
#import "Plugin.h"
// Singletonish
@interface PluginController : NSObject <CogPluginController> {


@@ -31,7 +31,6 @@ static std::map<std::string, Cached_Metadata> Cache_List;
static RedundantPlaylistDataStore *Cache_Data_Store = nil;
static bool Cache_Running = false;
static bool Cache_Stopped = false;
static std::thread *Cache_Thread = NULL;
@@ -45,8 +44,6 @@ static void cache_init() {
static void cache_deinit() {
Cache_Running = false;
Cache_Thread->join();
while(!Cache_Stopped)
usleep(500);
delete Cache_Thread;
Cache_Data_Store = nil;
}
@@ -138,8 +135,6 @@ static void cache_run() {
std::this_thread::sleep_for(dura);
}
Cache_Stopped = true;
}
@implementation PluginController
@@ -201,9 +196,6 @@ static PluginController *sharedPluginController = nil;
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(bundleDidLoad:) name:NSBundleDidLoadNotification object:nil];
[self loadPlugins];
[[NSNotificationCenter defaultCenter] removeObserver:self name:NSBundleDidLoadNotification object:nil];
[self printPluginInfo];
}
}
@@ -389,6 +381,8 @@ static NSString *xmlEscapeString(NSString * string) {
<!DOCTYPE plist PUBLIC \"-//Apple//DTD PLIST 1.0//EN\" \"http://www.apple.com/DTDs/PropertyList-1.0.dtd\">\n\
<plist version=\"1.0\">\n\
<dict>\n\
\t<key>FirebaseCrashlyticsCollectionEnabled</key>\n\
\t<false/>\n\
\t<key>SUEnableInstallerLauncherService</key>\n\
\t<true/>\n\
\t<key>CFBundleDevelopmentRegion</key>\n\
@@ -420,16 +414,14 @@ static NSString *xmlEscapeString(NSString * string) {
NSString * plistFooter = @"\t</array>\n\
\t<key>CFBundleExecutable</key>\n\
\t<string>Cog</string>\n\
\t<key>CFBundleHelpBookFolder</key>\n\
\t<string>Cog.help</string>\n\
\t<key>CFBundleHelpBookName</key>\n\
\t<string>org.cogx.cog.help</string>\n\
\t<key>CFBundleIdentifier</key>\n\
\t<string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>\n\
\t<key>CFBundleInfoDictionaryVersion</key>\n\
\t<string>6.0</string>\n\
\t<key>CFBundleName</key>\n\
\t<string>$(PRODUCT_NAME)</string>\n\
\t<key>CFBundleDisplayName</key>\n\
\t<string>$(PRODUCT_NAME)</string>\n\
\t<key>CFBundlePackageType</key>\n\
\t<string>APPL</string>\n\
\t<key>CFBundleShortVersionString</key>\n\
@@ -467,22 +459,12 @@ static NSString *xmlEscapeString(NSString * string) {
\t<string>MediaKeysApplication</string>\n\
\t<key>NSRemindersUsageDescription</key>\n\
\t<string>Cog has no use for your reminders. Why are you trying to access them with an audio player?</string>\n\
\t<key>NSDownloadsFolderUsageDescription</key>\n\
\t<string>We may request related audio files from this folder for playback purposes. We will only play back files you specifically add, unless you enable the option to add an entire folder. Granting permission either for individual files or for parent folders ensures their contents will remain playable in future sessions.</string>\n\
\t<key>NSDocumentsFolderUsageDescription</key>\n\
\t<string>We may request related audio files from this folder for playback purposes. We will only play back files you specifically add, unless you enable the option to add an entire folder. Granting permission either for individual files or for parent folders ensures their contents will remain playable in future sessions.</string>\n\
\t<key>NSDesktopFolderUsageDescription</key>\n\
\t<string>We may request related audio files from this folder for playback purposes. We will only play back files you specifically add, unless you enable the option to add an entire folder. Granting permission either for individual files or for parent folders ensures their contents will remain playable in future sessions.</string>\n\
\t<key>NSMotionUsageDescription</key>\n\
\t<string>Cog optionally supports motion tracking headphones for head tracked positional audio, using its own low latency positioning model.</string>\n\
\t<key>OSAScriptingDefinition</key>\n\
\t<string>Cog.sdef</string>\n\
\t<key>SUFeedURL</key>\n\
\t<string>https://cogcdn.cog.losno.co/mercury.xml</string>\n\
\t<key>SUPublicEDKey</key>\n\
\t<string>omxG7Rp0XK9/YEvKbVy7cd44eVAh1LJB6CmjQwjOJz4=</string>\n\
\t<key>ITSAppUsesNonExemptEncryption</key>\n\
\t<false/>\n\
</dict>\n\
</plist>\n";
NSMutableArray * decodersRegistered = [[NSMutableArray alloc] init];
@@ -648,28 +630,11 @@ static NSString *xmlEscapeString(NSString * string) {
}
}
if(skip && [classString isEqualToString:@"CueSheetDecoder"]) {
classString = @"SilenceDecoder";
}
Class decoder = NSClassFromString(classString);
return [[decoder alloc] init];
}
+ (BOOL)isCoverFile:(NSString *)fileName {
for(NSString *coverFileName in [PluginController coverNames]) {
if([[[[fileName lastPathComponent] stringByDeletingPathExtension] lowercaseString] hasSuffix:coverFileName]) {
return true;
}
}
return false;
}
+ (NSArray *)coverNames {
return @[@"cover", @"folder", @"album", @"front"];
}
- (NSDictionary *)metadataForURL:(NSURL *)url skipCue:(BOOL)skip {
NSString *urlScheme = [url scheme];
if([urlScheme isEqualToString:@"http"] ||
@@ -679,7 +644,6 @@ static NSString *xmlEscapeString(NSString * string) {
NSDictionary *cacheData = cache_access_metadata(url);
if(cacheData) return cacheData;
do {
NSString *ext = [url pathExtension];
NSArray *readers = [metadataReaders objectForKey:[ext lowercaseString]];
NSString *classString;
@@ -694,61 +658,22 @@ static NSString *xmlEscapeString(NSString * string) {
++i;
}
cacheData = [CogMetadataReaderMulti metadataForURL:url readers:_readers];
break;
cache_insert_metadata(url, cacheData);
return cacheData;
}
cacheData = [CogMetadataReaderMulti metadataForURL:url readers:readers];
break;
cache_insert_metadata(url, cacheData);
return cacheData;
} else {
classString = [readers objectAtIndex:0];
}
} else {
cacheData = nil;
break;
}
if(skip && [classString isEqualToString:@"CueSheetMetadataReader"]) {
cacheData = nil;
break;
return nil;
}
Class metadataReader = NSClassFromString(classString);
cacheData = [metadataReader metadataForURL:url];
} while(0);
if(cacheData == nil) {
cacheData = [NSDictionary dictionary];
}
if(cacheData) {
NSData *image = [cacheData objectForKey:@"albumArt"];
if(nil == image) {
// Try to load image from external file
NSString *path = [[url path] stringByDeletingLastPathComponent];
// Gather list of candidate image files
NSArray *fileNames = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:path error:nil];
NSArray *types = @[@"jpg", @"jpeg", @"png", @"gif", @"webp", @"avif", @"heic"];
NSArray *imageFileNames = [fileNames pathsMatchingExtensions:types];
for(NSString *fileName in imageFileNames) {
if([PluginController isCoverFile:fileName]) {
image = [NSData dataWithContentsOfFile:[path stringByAppendingPathComponent:fileName]];
break;
}
}
if(image) {
NSMutableDictionary *data = [cacheData mutableCopy];
[data setValue:image forKey:@"albumArt"];
cacheData = data;
}
}
}
cache_insert_metadata(url, cacheData);
return cacheData;
}
@@ -830,24 +755,3 @@ static NSString *xmlEscapeString(NSString * string) {
}
@end
NSString *guess_encoding_of_string(const char *input) {
NSString *ret = @"";
if(input && *input) {
@try {
ret = [NSString stringWithUTF8String:input];
}
@catch(NSException *e) {
ret = nil;
}
if(!ret) {
// This method is incredibly slow
NSData *stringData = [NSData dataWithBytes:input length:strlen(input)];
[NSString stringEncodingForData:stringData encodingOptions:nil convertedString:&ret usedLossyConversion:nil];
if(!ret) {
ret = @"";
}
}
}
return ret;
}

File diff suppressed because it is too large.


@@ -1,36 +0,0 @@
/*
Copyright (C) 2010 Christian Kothe
This program is free software; you can redistribute it and/or
modify it under the terms of the GNU General Public License
as published by the Free Software Foundation; either version 2
of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
*/
#ifndef CHANNELMAPS_H
#define CHANNELMAPS_H
#include "freesurround_decoder.h"
#include <map>
#include <vector>
const int grid_res = 21; // resolution of the lookup grid
// channel allocation maps (per setup)
typedef std::vector<std::vector<float*> > alloc_lut;
extern std::map<unsigned, alloc_lut> chn_alloc;
// channel metadata maps (per setup)
extern std::map<unsigned, std::vector<float> > chn_angle;
extern std::map<unsigned, std::vector<float> > chn_xsf;
extern std::map<unsigned, std::vector<float> > chn_ysf;
extern std::map<unsigned, std::vector<channel_id> > chn_id;
#endif


@@ -1,413 +0,0 @@
/*
Copyright (C) 2007-2010 Christian Kothe
This program is free software; you can redistribute it and/or
modify it under the terms of the GNU General Public License
as published by the Free Software Foundation; either version 2
of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
*/
#include "freesurround_decoder.h"
#include "channelmaps.h"
#include <Accelerate/Accelerate.h>
#include <cmath>
#include <vector>
#pragma warning(disable : 4244)
#define pi _pi
const float _pi = 3.141592654f;
const float epsilon = 0.000001f;
using namespace std;
#undef min
#undef max
static void *_memalign_malloc(size_t size, size_t align) {
void *ret = NULL;
if(posix_memalign(&ret, align, size) != 0) {
return NULL;
}
return ret;
}
static void _dsp_complexalloc(DSPDoubleSplitComplex *cpx, int count) {
cpx->realp = (double *)_memalign_malloc(count * sizeof(double), 16);
cpx->imagp = (double *)_memalign_malloc(count * sizeof(double), 16);
}
static void _dsp_complexfree(DSPDoubleSplitComplex *cpx) {
free(cpx->realp);
free(cpx->imagp);
}
// FreeSurround implementation
class decoder_impl {
public:
// instantiate the decoder with a given channel setup and processing block size (in samples)
decoder_impl(channel_setup setup, unsigned N)
: N(N),
wnd(N), inbuf(3 * N), setup(setup), C((unsigned)chn_alloc[setup].size()),
buffer_empty(true), lt(N), rt(N), dst(N), dstf(N),
dftsetupF(vDSP_DFT_zrop_CreateSetupD(0, N, vDSP_DFT_FORWARD)),
dftsetupB(vDSP_DFT_zrop_CreateSetupD(0, N, vDSP_DFT_INVERSE)) {
_dsp_complexalloc(&lf, N/2 + 1);
_dsp_complexalloc(&rf, N/2 + 1);
// allocate per-channel buffers
outbuf.resize((N + N / 2) * C);
signal.resize(C);
for(unsigned k = 0; k < C; k++)
_dsp_complexalloc(&signal[k], N/2 + 1);
// init the window function
for(unsigned k = 0; k < N; k++)
wnd[k] = sqrt(0.5 * (1 - cos(2 * pi * k / N)) / N);
// set default parameters
set_circular_wrap(90);
set_shift(0);
set_depth(1);
set_focus(0);
set_center_image(1);
set_front_separation(1);
set_rear_separation(1);
set_low_cutoff(40.0 / 22050);
set_high_cutoff(90.0 / 22050);
set_bass_redirection(false);
flush();
}
~decoder_impl() {
_dsp_complexfree(&lf);
_dsp_complexfree(&rf);
for(unsigned k = 0; k < C; k++)
_dsp_complexfree(&signal[k]);
vDSP_DFT_DestroySetupD(dftsetupF);
vDSP_DFT_DestroySetupD(dftsetupB);
}
// decode a stereo chunk, produces a multichannel chunk of the same size (lagged)
float *decode(const float *input) {
// append incoming data to the end of the input buffer
memcpy(&inbuf[N], &input[0], 8 * N);
// process first and second half, overlapped
buffered_decode(&inbuf[0]);
buffered_decode(&inbuf[N]);
// shift last half of the input to the beginning (for overlapping with a future block)
memcpy(&inbuf[0], &inbuf[2 * N], 4 * N);
buffer_empty = false;
return &outbuf[0];
}
// flush the internal buffers
void flush() {
memset(&outbuf[0], 0, outbuf.size() * 4);
memset(&inbuf[0], 0, inbuf.size() * 4);
buffer_empty = true;
}
// number of samples currently held in the buffer
unsigned buffered() {
return buffer_empty ? 0 : N / 2;
}
// set soundfield & rendering parameters
void set_circular_wrap(float v) {
circular_wrap = v;
}
void set_shift(float v) {
shift = v;
}
void set_depth(float v) {
depth = v;
}
void set_focus(float v) {
focus = v;
}
void set_center_image(float v) {
center_image = v;
}
void set_front_separation(float v) {
front_separation = v;
}
void set_rear_separation(float v) {
rear_separation = v;
}
void set_low_cutoff(float v) {
lo_cut = v * (N / 2);
}
void set_high_cutoff(float v) {
hi_cut = v * (N / 2);
}
void set_bass_redirection(bool v) {
use_lfe = v;
}
private:
// helper functions
static inline float sqr(double x) {
return x * x;
}
static inline double amplitude(const DSPDoubleSplitComplex &cpx, size_t index) {
return sqrt(sqr(cpx.realp[index]) + sqr(cpx.imagp[index]));
}
static inline double phase(const DSPDoubleSplitComplex &cpx, size_t index) {
return atan2(cpx.imagp[index], cpx.realp[index]);
}
static inline void polar(double a, double p, DSPDoubleSplitComplex &cpx, size_t index) {
cpx.realp[index] = a * cos(p);
cpx.imagp[index] = a * sin(p);
}
static inline float min(double a, double b) {
return a < b ? a : b;
}
static inline float max(double a, double b) {
return a > b ? a : b;
}
static inline float clamp(double x) {
return max(-1, min(1, x));
}
static inline float sign(double x) {
return x < 0 ? -1 : (x > 0 ? 1 : 0);
}
// get the distance of the soundfield edge, along a given angle
static inline double edgedistance(double a) {
return min(sqrt(1 + sqr(tan(a))), sqrt(1 + sqr(1 / tan(a))));
}
// get the index (and fractional offset!) in a piecewise-linear channel allocation grid
int map_to_grid(double &x) {
double gp = ((x + 1) * 0.5) * (grid_res - 1), i = min(grid_res - 2, floor(gp));
x = gp - i;
return i;
}
// decode a block of data and overlap-add it into outbuf
void buffered_decode(const float *input) {
// demultiplex and apply window function
vDSP_vspdp(input, 2, &lt[0], 1, N);
vDSP_vspdp(input + 1, 2, &rt[0], 1, N);
vDSP_vmulD(&lt[0], 1, &wnd[0], 1, &lt[0], 1, N);
vDSP_vmulD(&rt[0], 1, &wnd[0], 1, &rt[0], 1, N);
// map into spectral domain
vDSP_ctozD((DSPDoubleComplex *)(&lt[0]), 2, &lf, 1, N / 2);
vDSP_ctozD((DSPDoubleComplex *)(&rt[0]), 2, &rf, 1, N / 2);
vDSP_DFT_ExecuteD(dftsetupF, lf.realp, lf.imagp, lf.realp, lf.imagp);
vDSP_DFT_ExecuteD(dftsetupF, rf.realp, rf.imagp, rf.realp, rf.imagp);
for(unsigned c = 0; c < C; c++) {
signal[c].realp[0] = 0;
signal[c].imagp[0] = 0;
signal[c].realp[N/2] = 0;
signal[c].imagp[N/2] = 0;
}
bzero(signal[C - 1].realp, sizeof(double) * (N / 2 + 1));
bzero(signal[C - 1].imagp, sizeof(double) * (N / 2 + 1));
// compute multichannel output signal in the spectral domain
for(unsigned f = 1; f < N / 2; f++) {
// get Lt/Rt amplitudes & phases
double ampL = amplitude(lf, f), ampR = amplitude(rf, f);
double phaseL = phase(lf, f), phaseR = phase(rf, f);
// calculate the amplitude & phase differences
double ampDiff = clamp((ampL + ampR < epsilon) ? 0 : (ampR - ampL) / (ampR + ampL));
double phaseDiff = abs(phaseL - phaseR);
if(phaseDiff > pi) phaseDiff = 2 * pi - phaseDiff;
// decode into x/y soundfield position
double x, y;
transform_decode(ampDiff, phaseDiff, x, y);
// add wrap control
transform_circular_wrap(x, y, circular_wrap);
// add shift control
y = clamp(y - shift);
// add depth control
y = clamp(1 - (1 - y) * depth);
// add focus control
transform_focus(x, y, focus);
// add crossfeed control
x = clamp(x * (front_separation * (1 + y) / 2 + rear_separation * (1 - y) / 2));
// get total signal amplitude
double amp_total = sqrt(ampL * ampL + ampR * ampR);
// and total L/C/R signal phases
double phase_of[] = { phaseL, atan2(lf.imagp[f] + rf.imagp[f], lf.realp[f] + rf.realp[f]), phaseR };
// compute 2d channel map indexes p/q and update x/y to fractional offsets in the map grid
int p = map_to_grid(x), q = map_to_grid(y);
// map position to channel volumes
for(unsigned c = 0; c < C - 1; c++) {
// look up channel map at respective position (with bilinear interpolation) and build the signal
const vector<float *> &a = chn_alloc[setup][c];
polar(amp_total * ((1 - x) * (1 - y) * a[q][p] + x * (1 - y) * a[q][p + 1] + (1 - x) * y * a[q + 1][p] + x * y * a[q + 1][p + 1]),
phase_of[1 + (int)sign(chn_xsf[setup][c])], signal[c], f);
}
// optionally redirect bass
if(use_lfe && f < hi_cut) {
// level of LFE channel according to normalized frequency
double lfe_level = f < lo_cut ? 1 : 0.5 * (1 + cos(pi * (f - lo_cut) / (hi_cut - lo_cut)));
// assign LFE channel
polar(amp_total, phase_of[1], signal[C - 1], f);
signal[C - 1].realp[f] *= lfe_level;
signal[C - 1].imagp[f] *= lfe_level;
// subtract the signal from the other channels
for(unsigned c = 0; c < C - 1; c++) {
signal[c].realp[f] *= (1 - lfe_level);
signal[c].imagp[f] *= (1 - lfe_level);
}
}
}
// shift the last 2/3 to the first 2/3 of the output buffer
memmove(&outbuf[0], &outbuf[C * N / 2], N * C * 4);
// and clear the rest
memset(&outbuf[C * N], 0, C * 4 * N / 2);
// backtransform each channel and overlap-add
for(unsigned c = 0; c < C; c++) {
// back-transform into time domain
vDSP_DFT_ExecuteD(dftsetupB, signal[c].realp, signal[c].imagp, signal[c].realp, signal[c].imagp);
vDSP_ztocD(&signal[c], 1, (DSPDoubleComplex *)(&dst[0]), 2, N / 2);
// add the result to the last 2/3 of the output buffer, windowed (and remultiplex)
vDSP_vmulD(&dst[0], 1, &wnd[0], 1, &dst[0], 1, N);
vDSP_vdpsp(&dst[0], 1, &dstf[0], 1, N);
vDSP_vadd(&outbuf[C * N / 2 + c], C, &dstf[0], 1, &outbuf[C * N / 2 + c], C, N);
}
}
// transform amp/phase difference space into x/y soundfield space
void transform_decode(double a, double p, double &x, double &y) {
x = clamp(1.0047 * a + 0.46804 * a * p * p * p - 0.2042 * a * p * p * p * p + 0.0080586 * a * p * p * p * p * p * p * p - 0.0001526 * a * p * p * p * p * p * p * p * p * p * p - 0.073512 * a * a * a * p - 0.2499 * a * a * a * p * p * p * p + 0.016932 * a * a * a * p * p * p * p * p * p * p - 0.00027707 * a * a * a * p * p * p * p * p * p * p * p * p * p + 0.048105 * a * a * a * a * a * p * p * p * p * p * p * p - 0.0065947 * a * a * a * a * a * p * p * p * p * p * p * p * p * p * p + 0.0016006 * a * a * a * a * a * p * p * p * p * p * p * p * p * p * p * p - 0.0071132 * a * a * a * a * a * a * a * p * p * p * p * p * p * p * p * p + 0.0022336 * a * a * a * a * a * a * a * p * p * p * p * p * p * p * p * p * p * p - 0.0004804 * a * a * a * a * a * a * a * p * p * p * p * p * p * p * p * p * p * p * p);
y = clamp(0.98592 - 0.62237 * p + 0.077875 * p * p - 0.0026929 * p * p * p * p * p + 0.4971 * a * a * p - 0.00032124 * a * a * p * p * p * p * p * p + 9.2491e-006 * a * a * a * a * p * p * p * p * p * p * p * p * p * p + 0.051549 * a * a * a * a * a * a * a * a + 1.0727e-014 * a * a * a * a * a * a * a * a * a * a);
}
// apply a circular_wrap transformation to some position
void transform_circular_wrap(double &x, double &y, double refangle) {
if(refangle == 90)
return;
refangle = refangle * pi / 180;
double baseangle = 90 * pi / 180;
// translate into edge-normalized polar coordinates
double ang = atan2(x, y), len = sqrt(x * x + y * y);
len = len / edgedistance(ang);
// apply circular_wrap transform
if(abs(ang) < baseangle / 2)
// angle falls within the front region (to be enlarged)
ang *= refangle / baseangle;
else
// angle falls within the rear region (to be shrunken)
ang = pi - (-(((refangle - 2 * pi) * (pi - abs(ang)) * sign(ang)) / (2 * pi - baseangle)));
// translate back into soundfield position
len = len * edgedistance(ang);
x = clamp(sin(ang) * len);
y = clamp(cos(ang) * len);
}
// apply a focus transformation to some position
void transform_focus(double &x, double &y, double focus) {
if(focus == 0)
return;
// translate into edge-normalized polar coordinates
double ang = atan2(x, y), len = clamp(sqrt(x * x + y * y) / edgedistance(ang));
// apply focus
len = focus > 0 ? 1 - pow(1 - len, 1 + focus * 20) : pow(len, 1 - focus * 20);
// back-transform into euclidian soundfield position
len = len * edgedistance(ang);
x = clamp(sin(ang) * len);
y = clamp(cos(ang) * len);
}
// constants
unsigned N, C; // number of samples per input/output block, number of output channels
channel_setup setup; // the channel setup
// parameters
float circular_wrap; // angle of the front soundstage around the listener (90°=default)
float shift; // forward/backward offset of the soundstage
float depth; // backward extension of the soundstage
float focus; // localization of the sound events
float center_image; // presence of the center speaker
float front_separation; // front stereo separation
float rear_separation; // rear stereo separation
float lo_cut, hi_cut; // LFE cutoff frequencies
bool use_lfe; // whether to use the LFE channel
// FFT data structures
vector<double> lt, rt, dst; // left total, right total (source arrays), time-domain destination buffer array
vector<float> dstf; // float conversion destination array
DSPDoubleSplitComplex lf, rf; // left total / right total in frequency domain
vDSP_DFT_SetupD dftsetupF, dftsetupB; // FFT objects
// buffers
bool buffer_empty; // whether the buffer is currently empty or dirty
vector<float> inbuf; // stereo input buffer (multiplexed)
vector<float> outbuf; // multichannel output buffer (multiplexed)
vector<double> wnd; // the window function, precomputed
vector<DSPDoubleSplitComplex> signal; // the signal to be constructed in every channel, in the frequency domain
};
// implementation of the shell class
freesurround_decoder::freesurround_decoder(channel_setup setup, unsigned blocksize)
: impl(new decoder_impl(setup, blocksize)) {
}
freesurround_decoder::~freesurround_decoder() {
delete impl;
}
float *freesurround_decoder::decode(const float *input) {
return impl->decode(input);
}
void freesurround_decoder::flush() {
impl->flush();
}
void freesurround_decoder::circular_wrap(float v) {
impl->set_circular_wrap(v);
}
void freesurround_decoder::shift(float v) {
impl->set_shift(v);
}
void freesurround_decoder::depth(float v) {
impl->set_depth(v);
}
void freesurround_decoder::focus(float v) {
impl->set_focus(v);
}
void freesurround_decoder::center_image(float v) {
impl->set_center_image(v);
}
void freesurround_decoder::front_separation(float v) {
impl->set_front_separation(v);
}
void freesurround_decoder::rear_separation(float v) {
impl->set_rear_separation(v);
}
void freesurround_decoder::low_cutoff(float v) {
impl->set_low_cutoff(v);
}
void freesurround_decoder::high_cutoff(float v) {
impl->set_high_cutoff(v);
}
void freesurround_decoder::bass_redirection(bool v) {
impl->set_bass_redirection(v);
}
unsigned freesurround_decoder::buffered() {
return impl->buffered();
}
unsigned freesurround_decoder::num_channels(channel_setup s) {
return (unsigned)chn_id[s].size();
}
channel_id freesurround_decoder::channel_at(channel_setup s, unsigned i) {
return i < chn_id[s].size() ? chn_id[s][i] : ci_none;
}


@@ -1,210 +0,0 @@
/*
Copyright (C) 2007-2010 Christian Kothe
This program is free software; you can redistribute it and/or
modify it under the terms of the GNU General Public License
as published by the Free Software Foundation; either version 2
of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
*/
#ifndef FREESURROUND_DECODER_H
#define FREESURROUND_DECODER_H
/**
* Identifiers for the supported output channels (from front to back, left to right).
* The ordering here also determines the ordering of interleaved samples in the output signal.
*/
typedef enum channel_id {
ci_none = 0,
ci_front_left = 1 << 1,
ci_front_center_left = 1 << 2,
ci_front_center = 1 << 3,
ci_front_center_right = 1 << 4,
ci_front_right = 1 << 5,
ci_side_front_left = 1 << 6,
ci_side_front_right = 1 << 7,
ci_side_center_left = 1 << 8,
ci_side_center_right = 1 << 9,
ci_side_back_left = 1 << 10,
ci_side_back_right = 1 << 11,
ci_back_left = 1 << 12,
ci_back_center_left = 1 << 13,
ci_back_center = 1 << 14,
ci_back_center_right = 1 << 15,
ci_back_right = 1 << 16,
ci_lfe = 1 << 31
} channel_id;
/**
* The supported output channel setups.
* A channel setup is defined by the set of channels that are present. Here is a graphic
* of the cs_5point1 setup: http://en.wikipedia.org/wiki/File:5_1_channels_(surround_sound)_label.svg
*/
typedef enum channel_setup {
cs_stereo = ci_front_left | ci_front_right | ci_lfe,
cs_3stereo = ci_front_left | ci_front_center | ci_front_right | ci_lfe,
cs_5stereo = ci_front_left | ci_front_center_left | ci_front_center | ci_front_center_right | ci_front_right | ci_lfe,
cs_4point1 = ci_front_left | ci_front_right | ci_back_left | ci_back_right | ci_lfe,
cs_5point1 = ci_front_left | ci_front_center | ci_front_right | ci_back_left | ci_back_right | ci_lfe,
cs_6point1 = ci_front_left | ci_front_center | ci_front_right | ci_side_center_left | ci_side_center_right | ci_back_center | ci_lfe,
cs_7point1 = ci_front_left | ci_front_center | ci_front_right | ci_side_center_left | ci_side_center_right | ci_back_left | ci_back_right | ci_lfe,
cs_7point1_panorama = ci_front_left | ci_front_center_left | ci_front_center | ci_front_center_right | ci_front_right |
ci_side_center_left | ci_side_center_right | ci_lfe,
cs_7point1_tricenter = ci_front_left | ci_front_center_left | ci_front_center | ci_front_center_right | ci_front_right |
ci_back_left | ci_back_right | ci_lfe,
cs_8point1 = ci_front_left | ci_front_center | ci_front_right | ci_side_center_left | ci_side_center_right |
ci_back_left | ci_back_center | ci_back_right | ci_lfe,
cs_9point1_densepanorama = ci_front_left | ci_front_center_left | ci_front_center | ci_front_center_right | ci_front_right |
ci_side_front_left | ci_side_front_right | ci_side_center_left | ci_side_center_right | ci_lfe,
cs_9point1_wrap = ci_front_left | ci_front_center_left | ci_front_center | ci_front_center_right | ci_front_right |
ci_side_center_left | ci_side_center_right | ci_back_left | ci_back_right | ci_lfe,
cs_11point1_densewrap = ci_front_left | ci_front_center_left | ci_front_center | ci_front_center_right | ci_front_right |
ci_side_front_left | ci_side_front_right | ci_side_center_left | ci_side_center_right |
ci_side_back_left | ci_side_back_right | ci_lfe,
cs_13point1_totalwrap = ci_front_left | ci_front_center_left | ci_front_center | ci_front_center_right | ci_front_right |
ci_side_front_left | ci_side_front_right | ci_side_center_left | ci_side_center_right |
ci_side_back_left | ci_side_back_right | ci_back_left | ci_back_right | ci_lfe,
cs_16point1 = ci_front_left | ci_front_center_left | ci_front_center | ci_front_center_right | ci_front_right |
ci_side_front_left | ci_side_front_right | ci_side_center_left | ci_side_center_right | ci_side_back_left |
ci_side_back_right | ci_back_left | ci_back_center_left | ci_back_center | ci_back_center_right | ci_back_right | ci_lfe,
cs_legacy = 0 // same channels as cs_5point1 but different upmixing transform; does not support the focus control
} channel_setup;
/**
* The FreeSurround decoder.
*/
class freesurround_decoder {
public:
/**
* Create an instance of the decoder.
* @param setup The output channel setup -- determines the number of output channels
* and their place in the sound field.
* @param blocksize Granularity at which data is processed by the decode() function.
* Must be a power of two and should correspond to ca. 10ms worth of single-channel
* samples (default is 4096 for 44.1Khz data). Do not make it shorter or longer
* than 5ms to 20ms since the granularity at which locations are decoded
* changes with this.
*/
freesurround_decoder(channel_setup setup = cs_5point1, unsigned blocksize = 4096);
~freesurround_decoder();
/**
* Decode a chunk of stereo sound. The output is delayed by half of the blocksize.
* This function is the only one needed for straightforward decoding.
* @param input Contains exactly blocksize (multiplexed) stereo samples, i.e. 2*blocksize numbers.
* @return A pointer to an internal buffer of exactly blocksize (multiplexed) multichannel samples.
* The actual number of values depends on the number of output channels in the chosen
* channel setup.
*/
float *decode(const float *input);
/**
* Flush the internal buffer.
*/
void flush();
// --- soundfield transformations
// These functions allow to set up geometric transformations of the sound field after it has been decoded.
// The sound field is best pictured as a 2-dimensional square with the listener in its
// center which can be shifted or stretched in various ways before it is sent to the
// speakers. The order in which these transformations are applied is as listed below.
/**
* Allows to wrap the soundfield around the listener in a circular manner.
* Determines the angle of the frontal sound stage relative to the listener, in degrees.
* A setting of 90° corresponds to standard surround decoding, 180° stretches the front stage from
* ear to ear, 270° wraps it around most of the head. The side and rear content of the sound
* field is compressed accordingly behind the listener. (default: 90, range: [0°..360°])
*/
void circular_wrap(float v);
/**
* Allows to shift the soundfield forward or backward.
* Value range: [-1.0..+1.0]. 0 is no offset, positive values move the sound
* forward, negative values move it backwards. (default: 0)
*/
void shift(float v);
/**
* Allows to scale the soundfield backwards.
* Value range: [0.0..+5.0] -- 0 is all compressed to the front, 1 is no change, 5 is scaled 5x backwards (default: 1)
*/
void depth(float v);
/**
* Allows to control the localization (i.e., focality) of sources.
* Value range: [-1.0..+1.0] -- 0 means unchanged, positive means more localized, negative means more ambient (default: 0)
*/
void focus(float v);
// --- rendering parameters
// These parameters control how the sound field is mapped onto speakers.
/**
* Set the presence of the front center channel(s).
* Value range: [0.0..1.0] -- fully present at 1.0, fully replaced by left/right at 0.0 (default: 1).
* The default of 1.0 results in spec-conformant decoding ("movie mode") while a value of 0.7 is
* better suited for music reproduction (which is usually mixed without a center channel).
*/
void center_image(float v);
/**
* Set the front stereo separation.
* Value range: [0.0..inf] -- 1.0 is default, 0.0 is mono.
*/
void front_separation(float v);
/**
* Set the rear stereo separation.
* Value range: [0.0..inf] -- 1.0 is default, 0.0 is mono.
*/
void rear_separation(float v);
// --- bass redirection (to LFE)
/**
* Enable/disable LFE channel (default: false = disabled)
*/
void bass_redirection(bool v);
/**
* Set the lower end of the transition band, in Hz/Nyquist (default: 40/22050).
*/
void low_cutoff(float v);
/**
* Set the upper end of the transition band, in Hz/Nyquist (default: 90/22050).
*/
void high_cutoff(float v);
// --- info
/**
* Number of samples currently held in the buffer.
*/
unsigned buffered();
/**
* Number of channels in the given setup.
*/
static unsigned num_channels(channel_setup s);
/**
* Channel id of the i'th channel in the given setup.
*/
static channel_id channel_at(channel_setup s, unsigned i);
private:
class decoder_impl *impl; // private implementation
};
#endif
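// Illustrative only, not part of this header: a minimal usage sketch of the decoder API
// documented above, assuming 4096-frame blocks of interleaved 44.1 kHz stereo input.
// upmix_block and its buffer parameters are hypothetical names.
#include "freesurround_decoder.h"
static void upmix_block(const float *stereo_in /* 2 * 4096 floats */, float *surround_out /* 6 * 4096 floats */) {
	static freesurround_decoder decoder(cs_5point1, 4096);
	decoder.center_image(0.7f); // music-friendly center presence, per the notes above
	const float *decoded = decoder.decode(stereo_in); // output lags the input by blocksize/2
	const unsigned channels = freesurround_decoder::num_channels(cs_5point1); // 6 for cs_5point1
	for(unsigned i = 0; i < 4096 * channels; i++)
		surround_out[i] = decoded[i];
}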


@@ -1,5 +1,5 @@
/*
Copyright (C) 2010-2023, Christopher Snowhill,
Copyright (C) 2010-2022, Christopher Snowhill,
All rights reserved.
Optimizations by Gumboot
Additional work by Burt P.


@@ -1,25 +0,0 @@
#pragma once
// The functions provide little endianness to native endianness conversion and back again
#if(defined(_MSC_VER) && defined(_WIN32)) || defined(__APPLE__)
template <typename T>
inline void from_little_endian_inplace(T& x) {
}
template <typename T>
inline T from_little_endian(T x) {
return x;
}
template <typename T>
inline void to_little_endian_inplace(T& x) {
}
template <typename T>
inline T to_little_endian(T x) {
return x;
}
#else
#error "Specify endianness conversion for your platform"
#endif


@@ -1,641 +0,0 @@
#include "HrtfData.h"
#include "Endianness.h"
#include <algorithm>
#include <cassert>
#include <cmath>
typedef struct {
uint8_t bytes[3];
} sample_int24_t;
const double pi = M_PI;
template <typename T>
void read_stream(std::istream& stream, T& value) {
stream.read(reinterpret_cast<std::istream::char_type*>(&value), sizeof(value));
from_little_endian_inplace(value);
}
HrtfData::HrtfData(std::istream& stream) {
const char required_magic00[] = { 'M', 'i', 'n', 'P', 'H', 'R', '0', '0' };
const char required_magic01[] = { 'M', 'i', 'n', 'P', 'H', 'R', '0', '1' };
const char required_magic02[] = { 'M', 'i', 'n', 'P', 'H', 'R', '0', '2' };
const char required_magic03[] = { 'M', 'i', 'n', 'P', 'H', 'R', '0', '3' };
char actual_magic[sizeof(required_magic03) / sizeof(required_magic03[0])];
stream.read(actual_magic, sizeof(actual_magic));
if(std::equal(std::begin(required_magic03), std::end(required_magic03), std::begin(actual_magic), std::end(actual_magic))) {
LoadHrtf03(stream);
} else if(std::equal(std::begin(required_magic02), std::end(required_magic02), std::begin(actual_magic), std::end(actual_magic))) {
LoadHrtf02(stream);
} else if(std::equal(std::begin(required_magic01), std::end(required_magic01), std::begin(actual_magic), std::end(actual_magic))) {
LoadHrtf01(stream);
} else if(std::equal(std::begin(required_magic00), std::end(required_magic00), std::begin(actual_magic), std::end(actual_magic))) {
LoadHrtf00(stream);
} else {
throw std::logic_error("Bad file format.");
}
}
void HrtfData::LoadHrtf03(std::istream& stream) {
// const uint8_t ChanType_LeftOnly{0};
const uint8_t ChanType_LeftRight{ 1 };
uint32_t sample_rate;
uint8_t channel_type;
uint8_t impulse_response_length;
uint8_t distances_count;
read_stream(stream, sample_rate);
read_stream(stream, channel_type);
read_stream(stream, impulse_response_length);
read_stream(stream, distances_count);
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
if(channel_type > ChanType_LeftRight) {
throw std::logic_error("Invalid channel format.");
}
int channel_count = channel_type == ChanType_LeftRight ? 2 : 1;
std::vector<DistanceData> distances(distances_count);
for(uint8_t i = 0; i < distances_count; i++) {
uint16_t distance;
read_stream(stream, distance);
distances[i].distance = float(distance) / 1000.0f;
uint8_t elevations_count;
read_stream(stream, elevations_count);
distances[i].elevations.resize(elevations_count);
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
for(uint8_t j = 0; j < elevations_count; j++) {
uint8_t azimuth_count;
read_stream(stream, azimuth_count);
distances[i].elevations[j].azimuths.resize(azimuth_count);
}
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
}
const float normalization_factor = 1.0f / 8388608.0f;
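// 8388608 = 2^23, so signed 24-bit samples are scaled into the [-1.0, 1.0) range.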
for(auto& distance : distances) {
for(auto& elevation : distance.elevations) {
for(auto& azimuth : elevation.azimuths) {
azimuth.impulse_response.resize(impulse_response_length * channel_count);
for(auto& sample : azimuth.impulse_response) {
union {
sample_int24_t sample;
int32_t sample_int;
} sample_union;
sample_union.sample_int = 0;
read_stream(stream, sample_union.sample);
sample_union.sample_int <<= 8;
sample_union.sample_int >>= 8;
sample = sample_union.sample_int * normalization_factor;
}
}
}
}
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
uint8_t longest_delay = 0;
for(auto& distance : distances) {
for(auto& elevation : distance.elevations) {
for(auto& azimuth : elevation.azimuths) {
uint8_t delay;
read_stream(stream, delay);
azimuth.delay = delay;
longest_delay = std::max(longest_delay, delay);
if(channel_type == ChanType_LeftRight) {
read_stream(stream, delay);
azimuth.delay_right = delay;
longest_delay = std::max(longest_delay, delay);
}
}
}
}
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
std::sort(distances.begin(), distances.end(),
[](const DistanceData& lhs, const DistanceData& rhs) noexcept { return lhs.distance > rhs.distance; });
m_distances = std::move(distances);
m_channel_count = channel_count;
m_response_length = impulse_response_length;
m_sample_rate = sample_rate;
m_longest_delay = longest_delay;
}
void HrtfData::LoadHrtf02(std::istream& stream) {
// const uint8_t SampleType_S16{0};
const uint8_t SampleType_S24{ 1 };
// const uint8_t ChanType_LeftOnly{0};
const uint8_t ChanType_LeftRight{ 1 };
uint32_t sample_rate;
uint8_t sample_type;
uint8_t channel_type;
uint8_t impulse_response_length;
uint8_t distances_count;
read_stream(stream, sample_rate);
read_stream(stream, sample_type);
read_stream(stream, channel_type);
read_stream(stream, impulse_response_length);
read_stream(stream, distances_count);
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
if(sample_type > SampleType_S24) {
throw std::logic_error("Invalid sample type.");
}
if(channel_type > ChanType_LeftRight) {
throw std::logic_error("Invalid channel format.");
}
int channel_count = channel_type == ChanType_LeftRight ? 2 : 1;
std::vector<DistanceData> distances(distances_count);
for(uint8_t i = 0; i < distances_count; i++) {
uint16_t distance;
read_stream(stream, distance);
distances[i].distance = float(distance) / 1000.0f;
uint8_t elevations_count;
read_stream(stream, elevations_count);
distances[i].elevations.resize(elevations_count);
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
for(uint8_t j = 0; j < elevations_count; j++) {
uint8_t azimuth_count;
read_stream(stream, azimuth_count);
distances[i].elevations[j].azimuths.resize(azimuth_count);
}
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
}
const float normalization_factor = (sample_type == SampleType_S24) ? 1.0f / 8388608.0f : 1.0f / 32768.0f;
for(auto& distance : distances) {
for(auto& elevation : distance.elevations) {
for(auto& azimuth : elevation.azimuths) {
azimuth.impulse_response.resize(impulse_response_length * channel_count);
if(sample_type == SampleType_S24) {
for(auto& sample : azimuth.impulse_response) {
union {
sample_int24_t sample;
int32_t sample_int;
} sample_union;
sample_union.sample_int = 0;
read_stream(stream, sample_union.sample);
sample_union.sample_int <<= 8;
sample_union.sample_int >>= 8;
sample = sample_union.sample_int * normalization_factor;
}
} else {
for(auto& sample : azimuth.impulse_response) {
int16_t sample_from_file;
read_stream(stream, sample_from_file);
sample = sample_from_file * normalization_factor;
}
}
}
}
}
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
uint8_t longest_delay = 0;
for(auto& distance : distances) {
for(auto& elevation : distance.elevations) {
for(auto& azimuth : elevation.azimuths) {
uint8_t delay;
read_stream(stream, delay);
azimuth.delay = delay;
longest_delay = std::max(longest_delay, delay);
if(channel_type == ChanType_LeftRight) {
read_stream(stream, delay);
azimuth.delay_right = delay;
longest_delay = std::max(longest_delay, delay);
}
}
}
}
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
std::sort(distances.begin(), distances.end(),
[](const DistanceData& lhs, const DistanceData& rhs) noexcept { return lhs.distance > rhs.distance; });
m_distances = std::move(distances);
m_channel_count = channel_count;
m_response_length = impulse_response_length;
m_sample_rate = sample_rate;
m_longest_delay = longest_delay;
}
void HrtfData::LoadHrtf01(std::istream& stream) {
uint32_t sample_rate;
uint8_t impulse_response_length;
read_stream(stream, sample_rate);
read_stream(stream, impulse_response_length);
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
std::vector<DistanceData> distances(1);
distances[0].distance = 1.0;
uint8_t elevations_count;
read_stream(stream, elevations_count);
distances[0].elevations.resize(elevations_count);
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
for(uint8_t i = 0; i < elevations_count; i++) {
uint8_t azimuth_count;
read_stream(stream, azimuth_count);
distances[0].elevations[i].azimuths.resize(azimuth_count);
}
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
const float normalization_factor = 1.0f / 32768.0f;
for(auto& elevation : distances[0].elevations) {
for(auto& azimuth : elevation.azimuths) {
azimuth.impulse_response.resize(impulse_response_length);
for(auto& sample : azimuth.impulse_response) {
int16_t sample_from_file;
read_stream(stream, sample_from_file);
sample = sample_from_file * normalization_factor;
}
}
}
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
uint8_t longest_delay = 0;
for(auto& elevation : distances[0].elevations) {
for(auto& azimuth : elevation.azimuths) {
uint8_t delay;
read_stream(stream, delay);
delay <<= 2;
azimuth.delay = delay;
longest_delay = std::max(longest_delay, delay);
}
}
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
m_distances = std::move(distances);
m_channel_count = 1;
m_response_length = impulse_response_length;
m_sample_rate = sample_rate;
m_longest_delay = longest_delay;
}
void HrtfData::LoadHrtf00(std::istream& stream) {
uint32_t sample_rate;
uint16_t impulse_response_count;
uint16_t impulse_response_length;
read_stream(stream, sample_rate);
read_stream(stream, impulse_response_count);
read_stream(stream, impulse_response_length);
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
std::vector<DistanceData> distances(1);
distances[0].distance = 1.0;
uint8_t elevations_count;
read_stream(stream, elevations_count);
distances[0].elevations.resize(elevations_count);
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
std::vector<uint16_t> irOffsets(elevations_count);
for(uint8_t i = 0; i < elevations_count; i++) {
read_stream(stream, irOffsets[i]);
}
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
for(size_t i = 1; i < elevations_count; i++) {
if(irOffsets[i] <= irOffsets[i - 1]) {
throw std::logic_error("Invalid elevation offset.");
}
}
if(impulse_response_count <= irOffsets[elevations_count - 1]) {
throw std::logic_error("Invalid elevation offset.");
}
for(size_t i = 1; i < elevations_count; i++) {
distances[0].elevations[i - 1].azimuths.resize(irOffsets[i] - irOffsets[i - 1]);
}
distances[0].elevations[elevations_count - 1].azimuths.resize(impulse_response_count - irOffsets[elevations_count - 1]);
const float normalization_factor = 1.0f / 32768.0f;
for(auto& elevation : distances[0].elevations) {
for(auto& azimuth : elevation.azimuths) {
azimuth.impulse_response.resize(impulse_response_length);
for(auto& sample : azimuth.impulse_response) {
int16_t sample_from_file;
read_stream(stream, sample_from_file);
sample = sample_from_file * normalization_factor;
}
}
}
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
uint8_t longest_delay = 0;
for(auto& elevation : distances[0].elevations) {
for(auto& azimuth : elevation.azimuths) {
uint8_t delay;
read_stream(stream, delay);
delay <<= 2;
azimuth.delay = delay;
longest_delay = std::max(longest_delay, delay);
}
}
if(!stream || stream.eof()) {
throw std::logic_error("Failed reading file.");
}
m_distances = std::move(distances);
m_channel_count = 1;
m_response_length = impulse_response_length;
m_sample_rate = sample_rate;
m_longest_delay = longest_delay;
}
void HrtfData::get_direction_data(angle_t elevation, angle_t azimuth, distance_t distance, uint32_t channel, DirectionData& ref_data) const {
assert(elevation >= -angle_t(pi * 0.5));
assert(elevation <= angle_t(pi * 0.5));
assert(azimuth >= -angle_t(2.0 * pi));
assert(azimuth <= angle_t(2.0 * pi));
const float azimuth_mod = std::fmod(azimuth + angle_t(pi * 2.0), angle_t(pi * 2.0));
size_t distance_index0 = 0;
while(distance_index0 < m_distances.size() - 1 &&
m_distances[distance_index0].distance > distance) {
distance_index0++;
}
const size_t distance_index1 = std::min(distance_index0 + 1, m_distances.size() - 1);
const distance_t distance0 = m_distances[distance_index0].distance;
const distance_t distance1 = m_distances[distance_index1].distance;
const distance_t distance_delta = distance0 - distance1;
const float distance_fractional_part = distance_delta ? (distance - distance1) / distance_delta : 0;
const auto& elevations0 = m_distances[distance_index0].elevations;
const auto& elevations1 = m_distances[distance_index1].elevations;
const angle_t elevation_scaled0 = (elevation + angle_t(pi * 0.5)) * (elevations0.size() - 1) / angle_t(pi);
const angle_t elevation_scaled1 = (elevation + angle_t(pi * 0.5)) * (elevations1.size() - 1) / angle_t(pi);
const size_t elevation_index00 = static_cast<size_t>(elevation_scaled0);
const size_t elevation_index10 = static_cast<size_t>(elevation_scaled1);
const size_t elevation_index01 = std::min(elevation_index00 + 1, elevations0.size() - 1);
const size_t elevation_index11 = std::min(elevation_index10 + 1, elevations1.size() - 1);
const float elevation_fractional_part0 = std::fmod(elevation_scaled0, 1.0);
const float elevation_fractional_part1 = std::fmod(elevation_scaled1, 1.0);
const angle_t azimuth_scaled00 = azimuth_mod * elevations0[elevation_index00].azimuths.size() / angle_t(2 * pi);
const size_t azimuth_index000 = static_cast<size_t>(azimuth_scaled00) % elevations0[elevation_index00].azimuths.size();
const size_t azimuth_index001 = static_cast<size_t>(azimuth_scaled00 + 1) % elevations0[elevation_index00].azimuths.size();
const float azimuth_fractional_part00 = std::fmod(azimuth_scaled00, 1.0);
const angle_t azimuth_scaled10 = azimuth_mod * elevations1[elevation_index10].azimuths.size() / angle_t(2 * pi);
const size_t azimuth_index100 = static_cast<size_t>(azimuth_scaled10) % elevations1[elevation_index10].azimuths.size();
const size_t azimuth_index101 = static_cast<size_t>(azimuth_scaled10 + 1) % elevations1[elevation_index10].azimuths.size();
const float azimuth_fractional_part10 = std::fmod(azimuth_scaled10, 1.0);
const angle_t azimuth_scaled01 = azimuth_mod * elevations0[elevation_index01].azimuths.size() / angle_t(2 * pi);
const size_t azimuth_index010 = static_cast<size_t>(azimuth_scaled01) % elevations0[elevation_index01].azimuths.size();
const size_t azimuth_index011 = static_cast<size_t>(azimuth_scaled01 + 1) % elevations0[elevation_index01].azimuths.size();
const float azimuth_fractional_part01 = std::fmod(azimuth_scaled01, 1.0);
const angle_t azimuth_scaled11 = azimuth_mod * elevations1[elevation_index11].azimuths.size() / angle_t(2 * pi);
const size_t azimuth_index110 = static_cast<size_t>(azimuth_scaled11) % elevations1[elevation_index11].azimuths.size();
const size_t azimuth_index111 = static_cast<size_t>(azimuth_scaled11 + 1) % elevations1[elevation_index11].azimuths.size();
const float azimuth_fractional_part11 = std::fmod(azimuth_scaled11, 1.0);
const float blend_factor_000 = (1.0f - elevation_fractional_part0) * (1.0f - azimuth_fractional_part00) * distance_fractional_part;
const float blend_factor_001 = (1.0f - elevation_fractional_part0) * azimuth_fractional_part00 * distance_fractional_part;
const float blend_factor_010 = elevation_fractional_part0 * (1.0f - azimuth_fractional_part01) * distance_fractional_part;
const float blend_factor_011 = elevation_fractional_part0 * azimuth_fractional_part01 * distance_fractional_part;
const float blend_factor_100 = (1.0f - elevation_fractional_part1) * (1.0f - azimuth_fractional_part10) * (1.0f - distance_fractional_part);
const float blend_factor_101 = (1.0f - elevation_fractional_part1) * azimuth_fractional_part10 * (1.0f - distance_fractional_part);
const float blend_factor_110 = elevation_fractional_part1 * (1.0f - azimuth_fractional_part11) * (1.0f - distance_fractional_part);
const float blend_factor_111 = elevation_fractional_part1 * azimuth_fractional_part11 * (1.0f - distance_fractional_part);
delay_t delay0;
delay_t delay1;
if(channel == 0) {
delay0 =
elevations0[elevation_index00].azimuths[azimuth_index000].delay * blend_factor_000 + elevations0[elevation_index00].azimuths[azimuth_index001].delay * blend_factor_001 + elevations0[elevation_index01].azimuths[azimuth_index010].delay * blend_factor_010 + elevations0[elevation_index01].azimuths[azimuth_index011].delay * blend_factor_011;
delay1 =
elevations1[elevation_index10].azimuths[azimuth_index100].delay * blend_factor_100 + elevations1[elevation_index10].azimuths[azimuth_index101].delay * blend_factor_101 + elevations1[elevation_index11].azimuths[azimuth_index110].delay * blend_factor_110 + elevations1[elevation_index11].azimuths[azimuth_index111].delay * blend_factor_111;
} else {
delay0 =
elevations0[elevation_index00].azimuths[azimuth_index000].delay_right * blend_factor_000 + elevations0[elevation_index00].azimuths[azimuth_index001].delay_right * blend_factor_001 + elevations0[elevation_index01].azimuths[azimuth_index010].delay_right * blend_factor_010 + elevations0[elevation_index01].azimuths[azimuth_index011].delay_right * blend_factor_011;
delay1 =
elevations1[elevation_index10].azimuths[azimuth_index100].delay_right * blend_factor_100 + elevations1[elevation_index10].azimuths[azimuth_index101].delay_right * blend_factor_101 + elevations1[elevation_index11].azimuths[azimuth_index110].delay_right * blend_factor_110 + elevations1[elevation_index11].azimuths[azimuth_index111].delay_right * blend_factor_111;
}
ref_data.delay = delay0 + delay1;
if(ref_data.impulse_response.size() < m_response_length)
ref_data.impulse_response.resize(m_response_length);
for(size_t i = 0, j = channel; i < m_response_length; i++, j += m_channel_count) {
float sample0 =
elevations0[elevation_index00].azimuths[azimuth_index000].impulse_response[j] * blend_factor_000 + elevations0[elevation_index00].azimuths[azimuth_index001].impulse_response[j] * blend_factor_001 + elevations0[elevation_index01].azimuths[azimuth_index010].impulse_response[j] * blend_factor_010 + elevations0[elevation_index01].azimuths[azimuth_index011].impulse_response[j] * blend_factor_011;
float sample1 =
elevations1[elevation_index10].azimuths[azimuth_index100].impulse_response[j] * blend_factor_100 + elevations1[elevation_index10].azimuths[azimuth_index101].impulse_response[j] * blend_factor_101 + elevations1[elevation_index11].azimuths[azimuth_index110].impulse_response[j] * blend_factor_110 + elevations1[elevation_index11].azimuths[azimuth_index111].impulse_response[j] * blend_factor_111;
ref_data.impulse_response[i] = sample0 + sample1;
}
}
void HrtfData::get_direction_data(angle_t elevation, angle_t azimuth, distance_t distance, DirectionData& ref_data_left, DirectionData& ref_data_right) const {
assert(elevation >= -angle_t(pi * 0.5));
assert(elevation <= angle_t(pi * 0.5));
assert(azimuth >= -angle_t(2.0 * pi));
assert(azimuth <= angle_t(2.0 * pi));
get_direction_data(elevation, azimuth, distance, 0, ref_data_left);
if(m_channel_count == 1) {
get_direction_data(elevation, -azimuth, distance, 0, ref_data_right);
} else {
get_direction_data(elevation, azimuth, distance, 1, ref_data_right);
}
}
void HrtfData::sample_direction(angle_t elevation, angle_t azimuth, distance_t distance, uint32_t sample, uint32_t channel, float& value, float& delay) const {
assert(elevation >= -angle_t(pi * 0.5));
assert(elevation <= angle_t(pi * 0.5));
assert(azimuth >= -angle_t(2.0 * pi));
assert(azimuth <= angle_t(2.0 * pi));
size_t distance_index0 = 0;
while(distance_index0 < m_distances.size() - 1 &&
m_distances[distance_index0].distance > distance) {
distance_index0++;
}
const size_t distance_index1 = std::min(distance_index0 + 1, m_distances.size() - 1);
const distance_t distance0 = m_distances[distance_index0].distance;
const distance_t distance1 = m_distances[distance_index1].distance;
const distance_t distance_delta = distance0 - distance1;
const float distance_fractional_part = distance_delta ? (distance - distance1) / distance_delta : 0;
const auto& elevations0 = m_distances[distance_index0].elevations;
const auto& elevations1 = m_distances[distance_index1].elevations;
const float azimuth_mod = std::fmod(azimuth + angle_t(pi * 2.0), angle_t(pi * 2.0));
const angle_t elevation_scaled0 = (elevation + angle_t(pi * 0.5)) * (elevations0.size() - 1) / angle_t(pi);
const angle_t elevation_scaled1 = (elevation + angle_t(pi * 0.5)) * (elevations1.size() - 1) / angle_t(pi);
const size_t elevation_index00 = static_cast<size_t>(elevation_scaled0);
const size_t elevation_index10 = static_cast<size_t>(elevation_scaled1);
const size_t elevation_index01 = std::min(elevation_index00 + 1, elevations0.size() - 1);
const size_t elevation_index11 = std::min(elevation_index10 + 1, elevations1.size() - 1);
const float elevation_fractional_part0 = std::fmod(elevation_scaled0, 1.0);
const float elevation_fractional_part1 = std::fmod(elevation_scaled1, 1.0);
const angle_t azimuth_scaled00 = azimuth_mod * elevations0[elevation_index00].azimuths.size() / angle_t(2 * pi);
const size_t azimuth_index000 = static_cast<size_t>(azimuth_scaled00) % elevations0[elevation_index00].azimuths.size();
const size_t azimuth_index001 = static_cast<size_t>(azimuth_scaled00 + 1) % elevations0[elevation_index00].azimuths.size();
const float azimuth_fractional_part00 = std::fmod(azimuth_scaled00, 1.0);
const angle_t azimuth_scaled10 = azimuth_mod * elevations1[elevation_index10].azimuths.size() / angle_t(2 * pi);
const size_t azimuth_index100 = static_cast<size_t>(azimuth_scaled10) % elevations1[elevation_index10].azimuths.size();
const size_t azimuth_index101 = static_cast<size_t>(azimuth_scaled10 + 1) % elevations1[elevation_index10].azimuths.size();
const float azimuth_fractional_part10 = std::fmod(azimuth_scaled10, 1.0);
const angle_t azimuth_scaled01 = azimuth_mod * elevations0[elevation_index01].azimuths.size() / angle_t(2 * pi);
const size_t azimuth_index010 = static_cast<size_t>(azimuth_scaled01) % elevations0[elevation_index01].azimuths.size();
const size_t azimuth_index011 = static_cast<size_t>(azimuth_scaled01 + 1) % elevations0[elevation_index01].azimuths.size();
const float azimuth_fractional_part01 = std::fmod(azimuth_scaled01, 1.0);
const angle_t azimuth_scaled11 = azimuth_mod * elevations1[elevation_index11].azimuths.size() / angle_t(2 * pi);
const size_t azimuth_index110 = static_cast<size_t>(azimuth_scaled11) % elevations1[elevation_index11].azimuths.size();
const size_t azimuth_index111 = static_cast<size_t>(azimuth_scaled11 + 1) % elevations1[elevation_index11].azimuths.size();
const float azimuth_fractional_part11 = std::fmod(azimuth_scaled11, 1.0);
const float blend_factor_000 = (1.0f - elevation_fractional_part0) * (1.0f - azimuth_fractional_part00) * distance_fractional_part;
const float blend_factor_001 = (1.0f - elevation_fractional_part0) * azimuth_fractional_part00 * distance_fractional_part;
const float blend_factor_010 = elevation_fractional_part0 * (1.0f - azimuth_fractional_part01) * distance_fractional_part;
const float blend_factor_011 = elevation_fractional_part0 * azimuth_fractional_part01 * distance_fractional_part;
const float blend_factor_100 = (1.0f - elevation_fractional_part1) * (1.0f - azimuth_fractional_part10) * (1.0f - distance_fractional_part);
const float blend_factor_101 = (1.0f - elevation_fractional_part1) * azimuth_fractional_part10 * (1.0f - distance_fractional_part);
const float blend_factor_110 = elevation_fractional_part1 * (1.0f - azimuth_fractional_part11) * (1.0f - distance_fractional_part);
const float blend_factor_111 = elevation_fractional_part1 * azimuth_fractional_part11 * (1.0f - distance_fractional_part);
float delay0;
float delay1;
if(channel == 0) {
delay0 =
elevations0[elevation_index00].azimuths[azimuth_index000].delay * blend_factor_000 + elevations0[elevation_index00].azimuths[azimuth_index001].delay * blend_factor_001 + elevations0[elevation_index01].azimuths[azimuth_index010].delay * blend_factor_010 + elevations0[elevation_index01].azimuths[azimuth_index011].delay * blend_factor_011;
delay1 =
elevations1[elevation_index10].azimuths[azimuth_index100].delay * blend_factor_100 + elevations1[elevation_index10].azimuths[azimuth_index101].delay * blend_factor_101 + elevations1[elevation_index11].azimuths[azimuth_index110].delay * blend_factor_110 + elevations1[elevation_index11].azimuths[azimuth_index111].delay * blend_factor_111;
} else {
delay0 =
elevations0[elevation_index00].azimuths[azimuth_index000].delay_right * blend_factor_000 + elevations0[elevation_index00].azimuths[azimuth_index001].delay_right * blend_factor_001 + elevations0[elevation_index01].azimuths[azimuth_index010].delay_right * blend_factor_010 + elevations0[elevation_index01].azimuths[azimuth_index011].delay_right * blend_factor_011;
delay1 =
elevations1[elevation_index10].azimuths[azimuth_index100].delay_right * blend_factor_100 + elevations1[elevation_index10].azimuths[azimuth_index101].delay_right * blend_factor_101 + elevations1[elevation_index11].azimuths[azimuth_index110].delay_right * blend_factor_110 + elevations1[elevation_index11].azimuths[azimuth_index111].delay_right * blend_factor_111;
}
delay = delay0 + delay1;
sample = sample * m_channel_count + channel;
float value0 =
elevations0[elevation_index00].azimuths[azimuth_index000].impulse_response[sample] * blend_factor_000 + elevations0[elevation_index00].azimuths[azimuth_index001].impulse_response[sample] * blend_factor_001 + elevations0[elevation_index01].azimuths[azimuth_index010].impulse_response[sample] * blend_factor_010 + elevations0[elevation_index01].azimuths[azimuth_index011].impulse_response[sample] * blend_factor_011;
float value1 =
elevations1[elevation_index10].azimuths[azimuth_index100].impulse_response[sample] * blend_factor_100 + elevations1[elevation_index10].azimuths[azimuth_index101].impulse_response[sample] * blend_factor_101 + elevations1[elevation_index11].azimuths[azimuth_index110].impulse_response[sample] * blend_factor_110 + elevations1[elevation_index11].azimuths[azimuth_index111].impulse_response[sample] * blend_factor_111;
value = value0 + value1;
}
void HrtfData::sample_direction(angle_t elevation, angle_t azimuth, distance_t distance, uint32_t sample, float& value_left, float& delay_left, float& value_right, float& delay_right) const {
assert(elevation >= -angle_t(pi * 0.5));
assert(elevation <= angle_t(pi * 0.5));
assert(azimuth >= -angle_t(2.0 * pi));
assert(azimuth <= angle_t(2.0 * pi));
sample_direction(elevation, azimuth, distance, sample, 0, value_left, delay_left);
if(m_channel_count == 1) {
sample_direction(elevation, -azimuth, distance, sample, 0, value_right, delay_right);
} else {
sample_direction(elevation, azimuth, distance, sample, 1, value_right, delay_right);
}
}
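The two lookups above share one weighting scheme: the requested distance, elevation and azimuth are each bracketed by their two nearest measured grid points, and the eight surrounding measurements are mixed with the blend_factor_* products (distance fraction times elevation fraction times azimuth fraction). Because every axis contributes complementary weights, the eight factors always sum to one, so the interpolated delay and impulse response are a convex combination of stored data. A minimal standalone sketch (not part of this compare) that checks the sum-to-one property numerically:
#include <cstdio>
#include <random>
// Standalone check: for any fractional offsets, the eight blend factors used
// above form a convex combination, i.e. they sum to 1.
int main() {
    std::mt19937 rng(12345);
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);
    for(int trial = 0; trial < 5; ++trial) {
        const float fd = uni(rng);                    // distance_fractional_part
        const float fe0 = uni(rng), fe1 = uni(rng);   // elevation fractions at distance index 0 / 1
        const float fa00 = uni(rng), fa01 = uni(rng); // azimuth fractions, distance index 0
        const float fa10 = uni(rng), fa11 = uni(rng); // azimuth fractions, distance index 1
        const float sum =
            (1 - fe0) * (1 - fa00) * fd + (1 - fe0) * fa00 * fd +
            fe0 * (1 - fa01) * fd + fe0 * fa01 * fd +
            (1 - fe1) * (1 - fa10) * (1 - fd) + (1 - fe1) * fa10 * (1 - fd) +
            fe1 * (1 - fa11) * (1 - fd) + fe1 * fa11 * (1 - fd);
        std::printf("trial %d: sum of blend factors = %f\n", trial, sum);
    }
    return 0;
}
Every trial prints 1.000000 up to rounding, which is why no renormalization step is needed after the blend.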

View file

@@ -1,48 +0,0 @@
#pragma once
#include "HrtfTypes.h"
#include "IHrtfData.h"
#include <cstdint>
#include <iostream>
#include <vector>
struct ElevationData {
std::vector<DirectionData> azimuths;
};
struct DistanceData {
distance_t distance;
std::vector<ElevationData> elevations;
};
class HrtfData : public IHrtfData {
void LoadHrtf00(std::istream& stream);
void LoadHrtf01(std::istream& stream);
void LoadHrtf02(std::istream& stream);
void LoadHrtf03(std::istream& stream);
public:
HrtfData(std::istream& stream);
void get_direction_data(angle_t elevation, angle_t azimuth, distance_t distance, uint32_t channel, DirectionData& ref_data) const override;
void get_direction_data(angle_t elevation, angle_t azimuth, distance_t distance, DirectionData& ref_data_left, DirectionData& ref_data_right) const override;
void sample_direction(angle_t elevation, angle_t azimuth, distance_t distance, uint32_t sample, uint32_t channel, float& value, float& delay) const override;
void sample_direction(angle_t elevation, angle_t azimuth, distance_t distance, uint32_t sample, float& value_left, float& delay_left, float& value_right, float& delay_right) const override;
uint32_t get_sample_rate() const override {
return m_sample_rate;
}
uint32_t get_response_length() const override {
return m_response_length;
}
uint32_t get_longest_delay() const override {
return m_longest_delay;
}
private:
uint32_t m_sample_rate;
uint32_t m_response_length;
uint32_t m_longest_delay;
uint32_t m_channel_count;
std::vector<DistanceData> m_distances;
};

View file

@@ -1,14 +0,0 @@
#pragma once
#include <cstdint>
#include <vector>
typedef float distance_t;
typedef float angle_t;
typedef int delay_t;
struct DirectionData {
std::vector<float> impulse_response;
delay_t delay;
delay_t delay_right;
};

View file

@@ -1,19 +0,0 @@
#pragma once
#include "HrtfTypes.h"
class IHrtfData {
public:
virtual ~IHrtfData() = default;
virtual void get_direction_data(angle_t elevation, angle_t azimuth, distance_t distance, uint32_t channel, DirectionData& ref_data) const = 0;
virtual void get_direction_data(angle_t elevation, angle_t azimuth, distance_t distance, DirectionData& ref_data_left, DirectionData& ref_data_right) const = 0;
// Get only one IR sample at the given direction. The delay returned is the delay of the IR's beginning, not the sample's!
virtual void sample_direction(angle_t elevation, angle_t azimuth, distance_t distance, uint32_t sample, uint32_t channel, float& value, float& delay) const = 0;
// Get only one IR sample at the given direction for both channels. The delay returned is the delay of the IR's beginning, not the sample's!
virtual void sample_direction(angle_t elevation, angle_t azimuth, distance_t distance, uint32_t sample, float& value_left, float& delay_left, float& value_right, float& delay_right) const = 0;
virtual uint32_t get_sample_rate() const = 0;
virtual uint32_t get_response_length() const = 0;
virtual uint32_t get_longest_delay() const = 0;
};
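As the comments above note, sample_direction() returns a single IR tap together with the delay of the IR's beginning, so a caller that needs the full response either calls get_direction_data() once or loops sample_direction() over get_response_length() taps. A hedged sketch of the looping pattern, assuming the headers above are available; the helper name is illustrative and not from the repository:
#include <cstdint>
#include <vector>
#include "IHrtfData.h" // interface shown above; assumed to be on the include path
// Hypothetical helper: rebuild one channel's impulse response tap by tap.
static std::vector<float> build_response(const IHrtfData& hrtf,
                                         angle_t elevation, angle_t azimuth,
                                         distance_t distance, uint32_t channel,
                                         float& delay_out) {
    std::vector<float> response(hrtf.get_response_length());
    for(uint32_t i = 0; i < hrtf.get_response_length(); ++i) {
        float value = 0.0f;
        // The returned delay refers to the start of the IR, not to sample i.
        hrtf.sample_direction(elevation, azimuth, distance, i, channel, value, delay_out);
        response[i] = value;
    }
    return response;
}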

View file

@@ -1,39 +0,0 @@
SoX resampler plugin for foobar2000 audio player
Copyright (C) lvqcl
This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 2.1 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details.
You should have received a copy of the GNU Lesser General Public
License along with this library; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
This software uses code of SoX licensed under the terms of LGPLv2.1
Copyright (C) robs@users.sourceforge.net
Copyright (C) Chris Bagwell and SoX contributors
Copyright (C) Reuben Thomas
This software uses code of FFmpeg licensed under the terms of LGPLv2.1
Copyright (C) Fabrice Bellard
Copyright (C) Michael Niedermayer <michaelni@gmx.at>
Copyright (C) Alex Converse <alex dot converse at gmail dot com>
Copyright (C) Loren Merritt
Copyright (C) Vitor Sessak
Copyright (C) x264 project
This software uses code of General Purpose FFT Package
Copyright (C) Takuya OOURA
This software uses code of foobar2000 1.4 SDK
Copyright (C) 2001-2018, Peter Pawlowski

View file

@@ -1,219 +0,0 @@
/*
* Copyright (c) 2013, 2018 lvqcl
*
* Permission to use, copy, modify, and distribute this software for any
* purpose with or without fee is hereby granted, provided that the above
* copyright notice and this permission notice appear in all copies.
*
* THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
* WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
* MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
* ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
* WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
* ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
* OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
*/
#include <memory.h>
#include <stdlib.h>
#include <stdbool.h>
#include "lpc.h"
static void apply_window(float *const data, const size_t data_len) {
#if 0
if (0) // subtract the mean
{
double mean = 0;
for(int i = 0; i < (int)data_len; i++)
mean += data[i];
mean /= data_len;
for(int i = 0; i < (int)data_len; i++)
data[i] -= (float)mean;
}
#endif
if(1) // Welch window
{
const float n2 = (data_len + 1) / 2.0f;
for(int i = 0; i < (int)data_len; i++) {
float k = (i + 1 - n2) / n2;
data[data_len - 1 - i] *= 1.0f - k * k;
}
}
}
static float vorbis_lpc_from_data(float *data, float *lpci, int n, int m, double *aut, double *lpc) {
double error;
double epsilon;
int i, j;
/* autocorrelation, p+1 lag coefficients */
j = m + 1;
while(j--) {
double d = 0; /* double needed for accumulator depth */
for(i = j; i < n; i++) d += (double)data[i] * data[i - j];
aut[j] = d;
}
/* Generate lpc coefficients from autocorr values */
/* set our noise floor to about -100dB */
error = aut[0] * (1. + 1e-10);
epsilon = 1e-9 * aut[0] + 1e-10;
for(i = 0; i < m; i++) {
double r = -aut[i + 1];
if(error < epsilon) {
memset(lpc + i, 0, (m - i) * sizeof(*lpc));
goto done;
}
/* Sum up this iteration's reflection coefficient; note that in
Vorbis we don't save it. If anyone wants to recycle this code
and needs reflection coefficients, save the results of 'r' from
each iteration. */
for(j = 0; j < i; j++) r -= lpc[j] * aut[i - j];
r /= error;
/* Update LPC coefficients and total error */
lpc[i] = r;
for(j = 0; j < i / 2; j++) {
double tmp = lpc[j];
lpc[j] += r * lpc[i - 1 - j];
lpc[i - 1 - j] += r * tmp;
}
if(i & 1) lpc[j] += lpc[j] * r;
error *= 1. - r * r;
}
done:
/* slightly damp the filter */
{
double g = .99;
double damp = g;
for(j = 0; j < m; j++) {
lpc[j] *= damp;
damp *= g;
}
}
for(j = 0; j < m; j++) lpci[j] = (float)lpc[j];
/* we need the error value to know how big an impulse to hit the
filter with later */
return error;
}
static void vorbis_lpc_predict(float *coeff, float *prime, int m, float *data, long n, float *work) {
/* in: coeff[0...m-1] LPC coefficients
prime[0...m-1] initial values (allocated size of n+m-1)
out: data[0...n-1] data samples */
long i, j, o, p;
float y;
if(!prime)
for(i = 0; i < m; i++)
work[i] = 0.f;
else
for(i = 0; i < m; i++)
work[i] = prime[i];
for(i = 0; i < n; i++) {
y = 0;
o = i;
p = m;
for(j = 0; j < m; j++)
y -= work[o++] * coeff[--p];
data[i] = work[o] = y;
}
}
void lpc_extrapolate2(float *const data, const size_t data_len, const int nch, const int lpc_order, const size_t extra_bkwd, const size_t extra_fwd, void **extrapolate_buffer, size_t *extrapolate_buffer_size) {
const size_t max_to_prime = (data_len < lpc_order) ? data_len : lpc_order;
const size_t min_data_len = (data_len < lpc_order) ? lpc_order : data_len;
const size_t tdata_size = sizeof(float) * (extra_bkwd + min_data_len + extra_fwd);
const size_t aut_size = sizeof(double) * (lpc_order + 1);
const size_t lpc_size = sizeof(double) * lpc_order;
const size_t lpci_size = sizeof(float) * lpc_order;
const size_t work_size = sizeof(float) * (extra_bkwd + lpc_order + extra_fwd);
const size_t new_size = tdata_size + aut_size + lpc_size + lpci_size + work_size;
if(new_size > *extrapolate_buffer_size) {
*extrapolate_buffer = realloc(*extrapolate_buffer, new_size);
*extrapolate_buffer_size = new_size;
}
double *aut = (double *)(*extrapolate_buffer);
double *lpc = (double *)(*extrapolate_buffer + aut_size);
float *tdata = (float *)(*extrapolate_buffer + aut_size + lpc_size); // for 1 channel only
float *lpci = (float *)(*extrapolate_buffer + aut_size + lpc_size + tdata_size);
float *work = (float *)(*extrapolate_buffer + aut_size + lpc_size + tdata_size + lpci_size);
for(int c = 0; c < nch; c++) {
if(extra_bkwd) {
for(int i = 0; i < (int)data_len; i++)
tdata[min_data_len - 1 - i] = data[i * nch + c];
if(data_len < min_data_len)
for(int i = (int)data_len; i < (int)min_data_len; i++)
tdata[min_data_len - 1 - i] = 0.0f;
} else {
const ssize_t len_diff = min_data_len - data_len;
if(len_diff <= 0) {
for(int i = 0; i < (int)data_len; i++)
tdata[i] = data[i * nch + c];
} else {
for(int i = 0; i < (int)len_diff; i++)
tdata[i] = 0.0f;
for(int i = 0; i < (int)data_len; i++)
tdata[len_diff + i] = data[i * nch + c];
}
}
apply_window(tdata, min_data_len);
vorbis_lpc_from_data(tdata, lpci, (int)min_data_len, lpc_order, aut, lpc);
// restore after apply_window
if(extra_bkwd) {
for(int i = 0; i < (int)data_len; i++)
tdata[min_data_len - 1 - i] = data[i * nch + c];
if(data_len < min_data_len)
for(int i = (int)data_len; i < (int)min_data_len; i++)
tdata[min_data_len - 1 - i] = 0.0f;
} else {
const ssize_t len_diff = min_data_len - data_len;
if(len_diff <= 0) {
for(int i = 0; i < (int)data_len; i++)
tdata[i] = data[i * nch + c];
} else {
for(int i = 0; i < (int)len_diff; i++)
tdata[i] = 0.0f;
for(int i = 0; i < (int)data_len; i++)
tdata[len_diff + i] = data[i * nch + c];
}
}
vorbis_lpc_predict(lpci, tdata + min_data_len - lpc_order, lpc_order, tdata + min_data_len, extra_fwd + extra_bkwd, work);
if(extra_bkwd) {
for(int i = 0; i < extra_bkwd; i++)
data[(-i - 1) * nch + c] = tdata[min_data_len + i];
} else {
for(int i = 0; i < extra_fwd; i++)
data[(i + data_len) * nch + c] = tdata[min_data_len + i];
}
}
}
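Before the autocorrelation and the Levinson-Durbin-style recursion in vorbis_lpc_from_data(), apply_window() weights the analysis block with a Welch window: with n2 = (data_len + 1) / 2 and k = (i + 1 - n2) / n2, each sample is scaled by 1 - k^2, full weight in the middle of the block and tapering toward both ends. A standalone sketch (not repository code) that prints those weights for a short block:
#include <cstdio>
// Welch weights as computed by apply_window() above, for a 9-sample block.
int main() {
    const int data_len = 9;
    const float n2 = (data_len + 1) / 2.0f;
    for(int i = 0; i < data_len; i++) {
        const float k = (i + 1 - n2) / n2;
        const float weight = 1.0f - k * k; // multiplies data[data_len - 1 - i]
        std::printf("i=%d  k=%+.2f  weight=%.2f\n", i, k, weight);
    }
    return 0;
}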

View file

@@ -1,40 +0,0 @@
#ifndef MY_LPC_H
#define MY_LPC_H
/* data - beginning of the data
* data_len - length of the data (in samples) that serves as the basis for extrapolation
* nch - number of (interleaved) channels
* lpc_order - LPC order
* extra_bkwd - number of samples to pre-extrapolate
* extra_fwd - number of samples to post-extrapolate
*
* D = data; N = num_channels; LEN = data_len*N; EX = extra*N
*
* memory layout when invdir == false:
*
* [||||||||||||||||||||||||||||||||][||||||||||||||||||||][
* ^ D[0] ^ D[LEN] ^ D[LEN+EX]
*
* memory layout when invdir == true:
* ][||||||||||||||||||||][||||||||||||||||||||||||||||||||][
* ^ D[0] ^ D[LEN]
* ^ D[-1*N-EX] ^ D[-1*N]
*
*/
static const size_t LPC_ORDER = 32;
void lpc_extrapolate2(float * const data, const size_t data_len, const int nch, const int lpc_order, const size_t extra_bkwd, const size_t extra_fwd, void ** extrapolate_buffer, size_t * extrapolate_buffer_size);
static inline void lpc_extrapolate_bkwd(float * const data, const size_t data_len, const size_t prime_len, const int nch, const int lpc_order, const size_t extra_bkwd, void ** extrapolate_buffer, size_t * extrapolate_buffer_size)
{
(void)data_len;
lpc_extrapolate2(data, prime_len, nch, lpc_order, extra_bkwd, 0, extrapolate_buffer, extrapolate_buffer_size);
}
static inline void lpc_extrapolate_fwd(float * const data, const size_t data_len, const size_t prime_len, const int nch, const int lpc_order, const size_t extra_fwd, void ** extrapolate_buffer, size_t * extrapolate_buffer_size)
{
lpc_extrapolate2(data + (data_len - prime_len)*nch, prime_len, nch, lpc_order, 0, extra_fwd, extrapolate_buffer, extrapolate_buffer_size);
}
#endif
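Per the layout comment above, the caller owns one interleaved buffer with extra room before or after the real frames, and the helpers fill that room with LPC-predicted samples. A hedged usage sketch for the forward case, assuming lpc.h and lpc.c above are available and built with C linkage; buffer sizes and names are illustrative:
#include <cstdlib>
#include <vector>
extern "C" {
#include "lpc.h" // declarations shown above; C linkage assumed
}
int main() {
    const int nch = 2;            // interleaved stereo
    const size_t data_len = 4096; // real frames present in the buffer
    const size_t extra_fwd = 256; // frames to synthesize past the end
    // Interleaved buffer with headroom for the extrapolated tail.
    std::vector<float> buffer((data_len + extra_fwd) * nch, 0.0f);
    // ... fill buffer[0 .. data_len*nch) with audio here ...
    void* scratch = nullptr;      // lpc_extrapolate2 grows this as needed
    size_t scratch_size = 0;
    lpc_extrapolate_fwd(buffer.data(), data_len, data_len, nch,
                        (int)LPC_ORDER, extra_fwd, &scratch, &scratch_size);
    free(scratch);                // buffer now has extra_fwd predicted frames appended
    return 0;
}
The scratch pointer/size pair lets repeated calls reuse one reallocated workspace instead of allocating per block.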

View file

@@ -1,55 +0,0 @@
/* Copyright (c) 2018 lvqcl. All rights reserved.
*
* This library is free software; you can redistribute it and/or modify it
* under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation; either version 2.1 of the License, or (at
* your option) any later version.
*
* This library is distributed in the hope that it will be useful, but
* WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser
* General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with this library; if not, write to the Free Software Foundation,
* Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
*/
#ifndef UTIL_H_
#define UTIL_H_
#ifndef min
#define min(a,b) (((a)<(b))?(a):(b))
#endif
#ifndef max
#define max(a,b) (((a)>(b))?(a):(b))
#endif
static inline unsigned local_gcd(unsigned a, unsigned b)
{
if (a == 0 || b == 0) return 0;
unsigned c = a % b;
while (c != 0) { a = b; b = c; c = a % b; }
return b;
}
/*
In: *r1 and *r2: samplerates;
Out: *r1 and *r2: numbers of samples;
multiply r1 and r2 by n so that the durations are 1/N-th of a second;
limit n so that r1 and r2 aren't bigger than M
*/
static void samples_len(unsigned* r1, unsigned* r2, unsigned N, unsigned M) // example: r1 = 44100, r2 = 48000, N = 20, M = 8192
{
if (r1 == 0 || r2 == 0) return;
unsigned v = local_gcd(*r1, *r2); // v = 300
*r1 /= v; *r2 /= v; // r1 = 147; r2 = 160 == 1/300th of second
unsigned n = (v + N-1) / N; // n = 300/20 = 15 times
unsigned z = max(*r1, *r2); // z = 160
if (z*n > M) n = M / z; // 160*15 = 2400 < 8192;; if M == 1024: n = 1024/160 = 6; 160*6 = 960
if (n < 1) n = 1;
*r1 *= n; *r2 *= n; // r1 = 147*15 = 2205 samples, r2 = 160*15 = 2400 samples
}
#endif
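samples_len() reduces the two rates by their GCD to get the shortest block lengths of identical duration, then scales that block up to roughly 1/N of a second without exceeding M samples per rate. A standalone restatement (not repository code) that walks the worked example from the comment, 44100 Hz against 48000 Hz with N = 20 and M = 8192:
#include <cstdio>
// Mirrors local_gcd()/samples_len() from util.h above.
static unsigned local_gcd(unsigned a, unsigned b) {
    if(a == 0 || b == 0) return 0;
    unsigned c = a % b;
    while(c != 0) { a = b; b = c; c = a % b; }
    return b;
}
static void samples_len(unsigned* r1, unsigned* r2, unsigned N, unsigned M) {
    if(r1 == 0 || r2 == 0) return;
    unsigned v = local_gcd(*r1, *r2);     // 300 for 44100/48000
    *r1 /= v; *r2 /= v;                   // 147 and 160 samples = 1/300 s each
    unsigned n = (v + N - 1) / N;         // 15: stretch the block to about 1/20 s
    unsigned z = (*r1 > *r2) ? *r1 : *r2; // 160
    if(z * n > M) n = M / z;              // cap the block length at M samples
    if(n < 1) n = 1;
    *r1 *= n; *r2 *= n;                   // 2205 and 2400 samples
}
int main() {
    unsigned r1 = 44100, r2 = 48000;
    samples_len(&r1, &r2, 20, 8192);
    std::printf("block lengths: %u samples @ 44100 Hz, %u samples @ 48000 Hz\n", r1, r2);
    return 0;
}
It prints 2205 and 2400 samples, the 1/20 s blocks quoted in the comment.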

1
Audio/ThirdParty/r8brain-free-src vendored Submodule

@@ -0,0 +1 @@
Subproject commit afd61e7ed76d86a9bc6cb91fd0a9f305f853fe38

31
Audio/ThirdParty/r8bstate.cpp vendored Normal file
View file

@@ -0,0 +1,31 @@
//
// r8bstate.cpp
// CogAudio Framework
//
// Created by Christopher Snowhill on 6/24/22.
//
#include "r8bstate.h"
#include "r8bstate.hpp"
void *r8bstate_new(int channelCount, int bufferCapacity, double srcRate,
double dstRate) {
return (void *)new r8bstate(channelCount, bufferCapacity, srcRate, dstRate);
}
void r8bstate_delete(void *state) {
delete(r8bstate *)state;
}
double r8bstate_latency(void *state) {
return ((r8bstate *)state)->latency();
}
int r8bstate_resample(void *state, const float *input, size_t inCount, size_t *inDone,
float *output, size_t outMax) {
return ((r8bstate *)state)->resample(input, inCount, inDone, output, outMax);
}
int r8bstate_flush(void *state, float *output, size_t outMax) {
return ((r8bstate *)state)->flush(output, outMax);
}

33
Audio/ThirdParty/r8bstate.h vendored Normal file
View file

@@ -0,0 +1,33 @@
//
// r8bstate.h
// CogAudio Framework
//
// Created by Christopher Snowhill on 6/24/22.
//
#include <stdint.h>
#include <stdlib.h>
#ifndef r8bstate_h
#define r8bstate_h
#ifdef __cplusplus
extern "C" {
#endif
void *r8bstate_new(int channelCount, int bufferCapacity, double srcRate,
double dstRate);
void r8bstate_delete(void *);
double r8bstate_latency(void *);
int r8bstate_resample(void *, const float *input, size_t inCount, size_t *inDone,
float *output, size_t outMax);
int r8bstate_flush(void *, float *output, size_t outMax);
#ifdef __cplusplus
}
#endif
#endif /* r8bstate_h */
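r8bstate.h exposes the C++ resampler below to C and Objective-C callers. A hedged usage sketch, assuming the header above is on the include path and that counts are in frames per channel as the implementation suggests; rates and buffer sizes are illustrative only:
#include <cstddef>
#include <vector>
#include "r8bstate.h" // C interface shown above; assumed available
int main() {
    const int channels = 2;
    const int blockFrames = 1024;
    // Hypothetical resampling of a 44.1 kHz stereo block to 96 kHz.
    void* state = r8bstate_new(channels, blockFrames, 44100.0, 96000.0);
    std::vector<float> input(blockFrames * channels, 0.0f); // interleaved input (silence here)
    std::vector<float> output(blockFrames * channels * 4);  // generous room for the upsampled block
    size_t inDone = 0;
    int outFrames = r8bstate_resample(state, input.data(), blockFrames, &inDone,
                                      output.data(), output.size() / channels);
    // At end of stream, flush whatever the filters still hold (reusing the buffer for brevity).
    outFrames += r8bstate_flush(state, output.data(), output.size() / channels);
    (void)outFrames;
    r8bstate_delete(state);
    return 0;
}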

165
Audio/ThirdParty/r8bstate.hpp vendored Normal file
View file

@@ -0,0 +1,165 @@
//
// r8bstate.hpp
// CogAudio Framework
//
// Created by Christopher Snowhill on 3/3/22.
//
#ifndef r8bstate_hpp
#define r8bstate_hpp
#include <Accelerate/Accelerate.h>
#include "r8bbase.h"
#include "CDSPResampler.h"
struct r8bstate {
int channelCount;
int bufferCapacity;
size_t remainder;
uint64_t inProcessed;
uint64_t outProcessed;
double sampleRatio;
double dstRate;
r8b::CFixedBuffer<double> InBuf;
r8b::CFixedBuffer<double> *OutBufs;
r8b::CDSPResampler24 **Resamps;
r8bstate(int _channelCount, int _bufferCapacity, double srcRate, double _dstRate)
: channelCount(_channelCount), bufferCapacity(_bufferCapacity), inProcessed(0), outProcessed(0), remainder(0), dstRate(_dstRate) {
InBuf.alloc(bufferCapacity);
OutBufs = new r8b::CFixedBuffer<double>[channelCount];
Resamps = new r8b::CDSPResampler24 *[channelCount];
for(int i = 0; i < channelCount; ++i) {
Resamps[i] = new r8b::CDSPResampler24(srcRate, dstRate, bufferCapacity);
}
sampleRatio = dstRate / srcRate;
}
~r8bstate() {
delete[] OutBufs;
for(int i = 0; i < channelCount; ++i) {
delete Resamps[i];
}
delete[] Resamps;
}
double latency() {
return (((double)inProcessed * sampleRatio) - (double)outProcessed) / dstRate;
}
int resample(const float *input, size_t inCount, size_t *inDone, float *output, size_t outMax) {
int ret = 0;
int i;
if(inDone) *inDone = 0;
while(remainder > 0) {
size_t blockCount = remainder;
if(blockCount > outMax)
blockCount = outMax;
for(i = 0; i < channelCount; ++i) {
vDSP_vdpsp(&OutBufs[i][0], 1, output + i, channelCount, blockCount);
}
remainder -= blockCount;
if(remainder > 0) {
for(i = 0; i < channelCount; ++i) {
memmove(&OutBufs[i][0], &OutBufs[i][blockCount], remainder * sizeof(double));
}
}
output += channelCount * blockCount;
outProcessed += blockCount;
outMax -= blockCount;
ret += blockCount;
if(!outMax)
return ret;
}
while(inCount > 0) {
size_t blockCount = inCount;
if(blockCount > bufferCapacity)
blockCount = bufferCapacity;
int outputDone = 0;
for(i = 0; i < channelCount; ++i) {
double *outputPointer;
vDSP_vspdp(input + i, channelCount, &InBuf[0], 1, blockCount);
outputDone = Resamps[i]->process(InBuf, (int)blockCount, outputPointer);
if(outputDone) {
if(outputDone > outMax) {
vDSP_vdpsp(outputPointer, 1, output + i, channelCount, outMax);
remainder = outputDone - outMax;
OutBufs[i].alloc((int)remainder);
memcpy(&OutBufs[i][0], outputPointer + outMax, remainder * sizeof(double));
} else {
vDSP_vdpsp(outputPointer, 1, output + i, channelCount, outputDone);
}
}
}
size_t outputActual = outputDone - remainder;
input += channelCount * blockCount;
output += channelCount * outputActual;
inCount -= blockCount;
if(inDone) *inDone += blockCount;
inProcessed += blockCount;
outProcessed += outputActual;
outMax -= outputActual;
ret += outputActual;
if(remainder)
break;
}
return ret;
}
int flush(float *output, size_t outMax) {
int ret = 0;
int i;
if(remainder > 0) {
size_t blockCount = remainder;
if(blockCount > outMax)
blockCount = outMax;
for(i = 0; i < channelCount; ++i) {
vDSP_vdpsp(&OutBufs[i][0], 1, output + i, channelCount, blockCount);
}
remainder -= blockCount;
if(remainder > 0) {
for(i = 0; i < channelCount; ++i) {
memmove(&OutBufs[i][0], &OutBufs[i][blockCount], remainder * sizeof(double));
}
}
output += channelCount * blockCount;
outProcessed += blockCount;
outMax -= blockCount;
ret += blockCount;
if(!outMax)
return ret;
}
uint64_t outputWanted = ceil(inProcessed * sampleRatio);
memset(&InBuf[0], 0, sizeof(double) * bufferCapacity);
while(outProcessed < outputWanted) {
int outputDone = 0;
for(int i = 0; i < channelCount; ++i) {
double *outputPointer;
outputDone = Resamps[i]->process(InBuf, bufferCapacity, outputPointer);
if(outputDone) {
if(outputDone > (outputWanted - outProcessed))
outputDone = (int)(outputWanted - outProcessed);
if(outputDone > outMax) {
vDSP_vdpsp(outputPointer, 1, output + i, channelCount, outMax);
remainder = outputDone - outMax;
OutBufs[i].alloc((int)remainder);
memcpy(&OutBufs[i][0], outputPointer + outMax, remainder * sizeof(double));
} else {
vDSP_vdpsp(outputPointer, 1, output + i, channelCount, outputDone);
}
}
}
size_t outputActual = outputDone - remainder;
outProcessed += outputActual;
output += channelCount * outputActual;
outMax -= outputActual;
ret += outputActual;
if(remainder)
break;
}
return ret;
}
};
#endif /* r8bstate_hpp */
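latency() above reports how much audio the resampler is still holding: the input frames consumed so far are converted to expected output frames via the rate ratio, the frames already delivered are subtracted, and the difference is expressed in seconds at the destination rate. A standalone arithmetic sketch with made-up counters:
#include <cstdio>
int main() {
    // Illustrative numbers only: 44.1 kHz -> 48 kHz, one second of input consumed.
    const double srcRate = 44100.0, dstRate = 48000.0;
    const double sampleRatio = dstRate / srcRate;
    const unsigned long long inProcessed = 44100;  // input frames pushed so far
    const unsigned long long outProcessed = 47000; // output frames already delivered
    const double pendingFrames = (double)inProcessed * sampleRatio - (double)outProcessed;
    std::printf("pending output: %.1f frames (%.4f s)\n",
                pendingFrames, pendingFrames / dstRate);
    return 0;
}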

View file

@@ -1,30 +0,0 @@
//
// rsstate.cpp
// CogAudio Framework
//
// Created by Christopher Snowhill on 2/4/23.
//
#include "rsstate.h"
#include "rsstate.hpp"
void *rsstate_new(int channelCount, double srcRate, double dstRate) {
return (void *)new rsstate(channelCount, srcRate, dstRate);
}
void rsstate_delete(void *state) {
delete(rsstate *)state;
}
double rsstate_latency(void *state) {
return ((rsstate *)state)->latency();
}
int rsstate_resample(void *state, const float *input, size_t inCount, size_t *inDone,
float *output, size_t outMax) {
return ((rsstate *)state)->resample(input, inCount, inDone, output, outMax);
}
int rsstate_flush(void *state, float *output, size_t outMax) {
return ((rsstate *)state)->flush(output, outMax);
}

View file

@@ -1,32 +0,0 @@
//
// rsstate.h
// CogAudio Framework
//
// Created by Christopher Snowhill on 2/4/23.
//
#include <stdint.h>
#include <stdlib.h>
#ifndef rsstate_h
#define rsstate_h
#ifdef __cplusplus
extern "C" {
#endif
void *rsstate_new(int channelCount, double srcRate, double dstRate);
void rsstate_delete(void *);
double rsstate_latency(void *);
int rsstate_resample(void *, const float *input, size_t inCount, size_t *inDone,
float *output, size_t outMax);
int rsstate_flush(void *, float *output, size_t outMax);
#ifdef __cplusplus
}
#endif
#endif /* rsstate_h */

View file

@@ -1,81 +0,0 @@
//
// rsstate.hpp
// CogAudio Framework
//
// Created by Christopher Snowhill on 2/3/23.
//
#ifndef rsstate_hpp
#define rsstate_hpp
#include "soxr.h"
#include <cmath>
#include <vector>
struct rsstate {
int channelCount;
int bufferCapacity;
size_t remainder;
uint64_t inProcessed;
uint64_t outProcessed;
double sampleRatio;
double dstRate;
std::vector<float> SilenceBuf;
soxr_t Resampler;
rsstate(int _channelCount, double srcRate, double _dstRate)
: channelCount(_channelCount), inProcessed(0), outProcessed(0), remainder(0), dstRate(_dstRate) {
SilenceBuf.resize(1024 * channelCount);
memset(&SilenceBuf[0], 0, 1024 * channelCount * sizeof(float));
Resampler = soxr_create(srcRate, dstRate, channelCount, NULL, NULL, NULL, NULL);
sampleRatio = dstRate / srcRate;
}
~rsstate() {
soxr_delete(Resampler);
}
double latency() {
return (((double)inProcessed * sampleRatio) - (double)outProcessed) / dstRate;
}
int resample(const float *input, size_t inCount, size_t *inDone, float *output, size_t outMax) {
size_t outDone = 0;
soxr_error_t errmsg = soxr_process(Resampler, (soxr_in_t)input, inCount, inDone, (soxr_out_t)output, outMax, &outDone);
if(!errmsg) {
inProcessed += *inDone;
outProcessed += outDone;
return (int)outDone;
} else {
return 0;
}
}
int flush(float *output, size_t outMax) {
size_t outTotal = 0;
uint64_t outputWanted = std::ceil(inProcessed * sampleRatio);
while(outProcessed < outputWanted) {
size_t outWanted = outputWanted - outProcessed;
if(outWanted > outMax) {
outWanted = outMax;
}
size_t outDone = 0;
size_t inDone = 0;
soxr_error_t errmsg = soxr_process(Resampler, (soxr_in_t)(&SilenceBuf[0]), 1024, &inDone, (soxr_out_t)output, outWanted, &outDone);
if(!errmsg) {
outProcessed += outDone;
outTotal += outDone;
output += outDone * channelCount;
outMax -= outDone;
if(!outMax || outProcessed == outputWanted) {
return (int)outTotal;
}
} else {
return 0;
}
}
return (int)outTotal;
}
};
#endif /* rsstate_hpp */
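The wrapper above boils down to three libsoxr calls: soxr_create() with NULL specs (which selects the default interleaved 32-bit float I/O), soxr_process() per block, and soxr_delete() when done; flush() approximates a drain by pushing blocks of silence until the expected output count is reached. A stripped-down sketch of that call pattern, with illustrative rates and buffer sizes:
#include <cstddef>
#include <vector>
#include <soxr.h> // libsoxr header used by rsstate.hpp above; assumed available
int main() {
    const unsigned channels = 2;
    const size_t inFrames = 1024;
    std::vector<float> input(inFrames * channels, 0.0f);      // interleaved input block
    std::vector<float> output(inFrames * channels * 2, 0.0f); // room for ~2x upsampling
    // NULL specs select the defaults: 32-bit float samples, interleaved channels.
    soxr_t soxr = soxr_create(44100.0, 88200.0, channels, NULL, NULL, NULL, NULL);
    size_t inDone = 0, outDone = 0;
    soxr_error_t err = soxr_process(soxr, input.data(), inFrames, &inDone,
                                    output.data(), output.size() / channels, &outDone);
    // err is NULL on success; outDone holds the produced frame count.
    (void)err;
    soxr_delete(soxr);
    return 0;
}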

View file

@@ -1,5 +1,5 @@
//
// CogSemaphore.h
// Semaphore.h
// Cog
//
// Created by Vincent Spader on 8/2/05.

View file

@@ -1,12 +1,12 @@
//
// CogSemaphore.m
// Semaphore.m
// Cog
//
// Created by Vincent Spader on 8/2/05.
// Copyright 2005 Vincent Spader. All rights reserved.
//
#import <CogAudio/CogSemaphore.h>
#import "Semaphore.h"
@implementation Semaphore

View file

@@ -10,20 +10,20 @@
NS_ASSUME_NONNULL_BEGIN
@interface VisualizationController : NSObject {
double sampleRate;
double latency;
float *visAudio;
int visAudioCursor, visAudioSize;
}
+ (VisualizationController *)sharedController;
- (void)postLatency:(double)latency;
- (UInt64)samplesPosted;
- (void)postSampleRate:(double)sampleRate;
- (void)postVisPCM:(const float *)inPCM amount:(int)amount;
- (double)readSampleRate;
- (void)copyVisPCM:(float *_Nullable)outPCM visFFT:(float *_Nullable)outFFT latencyOffset:(double)latency;
- (void)reset;
- (void)copyVisPCM:(float *)outPCM visFFT:(float *)outFFT latencyOffset:(double)latency;
@end

View file

@@ -10,13 +10,7 @@
#import "fft.h"
@implementation VisualizationController {
double sampleRate;
double latency;
float *visAudio;
int visAudioCursor, visAudioSize;
uint64_t visSamplesPosted;
}
@implementation VisualizationController
static VisualizationController *_sharedController = nil;
@@ -33,7 +27,6 @@ static VisualizationController *_sharedController = nil;
self = [super init];
if(self) {
visAudio = NULL;
visAudioSize = 0;
latency = 0;
}
return self;
@@ -43,17 +36,6 @@ static VisualizationController *_sharedController = nil;
fft_free();
}
- (void)reset {
@synchronized (self) {
latency = 0;
visAudioCursor = 0;
visSamplesPosted = 0;
if(visAudio && visAudioSize) {
bzero(visAudio, sizeof(float) * visAudioSize);
}
}
}
- (void)postSampleRate:(double)sampleRate {
@synchronized(self) {
if(self->sampleRate != sampleRate) {
@@ -67,12 +49,6 @@ static VisualizationController *_sharedController = nil;
self->visAudio = visAudio;
self->visAudioSize = visAudioSize;
visAudioCursor %= visAudioSize;
} else {
if(self->visAudio) {
free(self->visAudio);
self->visAudio = NULL;
}
self->visAudioSize = 0;
}
}
}
@@ -80,9 +56,6 @@ static VisualizationController *_sharedController = nil;
- (void)postVisPCM:(const float *)inPCM amount:(int)amount {
@synchronized(self) {
if(!visAudioSize) {
return;
}
int samplesRead = 0;
while(amount > 0) {
int amountToCopy = (int)(visAudioSize - visAudioCursor);
@@ -92,7 +65,6 @@ static VisualizationController *_sharedController = nil;
if(visAudioCursor >= visAudioSize) visAudioCursor -= visAudioSize;
amount -= amountToCopy;
samplesRead += amountToCopy;
visSamplesPosted += amountToCopy;
}
}
}
@@ -108,25 +80,11 @@ static VisualizationController *_sharedController = nil;
}
}
- (UInt64)samplesPosted {
return visSamplesPosted;
}
- (void)copyVisPCM:(float *_Nullable)outPCM visFFT:(float *_Nullable)outFFT latencyOffset:(double)latency {
- (void)copyVisPCM:(float *)outPCM visFFT:(float *)outFFT latencyOffset:(double)latency {
if(!outPCM && !outFFT) return;
if(!visAudio || !visAudioSize) {
if(outPCM) bzero(outPCM, sizeof(float) * 4096);
if(outFFT) bzero(outFFT, sizeof(float) * 2048);
return;
}
void *visAudioTemp = calloc(sizeof(float), 4096);
if(!visAudioTemp) {
if(outPCM) bzero(outPCM, sizeof(float) * 4096);
if(outFFT) bzero(outFFT, sizeof(float) * 2048);
return;
}
float tempPCM[4096];
if(!outPCM) outPCM = &tempPCM[0];
@synchronized(self) {
if(!sampleRate) {
@@ -136,14 +94,11 @@ static VisualizationController *_sharedController = nil;
}
return;
}
int latencySamples = (int)(sampleRate * (self->latency + latency)) + 2048;
int latencySamples = (int)(sampleRate * (self->latency + latency));
if(latencySamples < 4096) latencySamples = 4096;
int readCursor = visAudioCursor - latencySamples;
int samples = 4096;
int samplesRead = 0;
if(latencySamples + samples > visAudioSize) {
samples = (int)(visAudioSize - latencySamples);
}
while(readCursor < 0)
readCursor += visAudioSize;
while(readCursor >= visAudioSize)
@@ -151,21 +106,16 @@ static VisualizationController *_sharedController = nil;
while(samples > 0) {
int samplesToRead = (int)(visAudioSize - readCursor);
if(samplesToRead > samples) samplesToRead = samples;
cblas_scopy(samplesToRead, visAudio + readCursor, 1, visAudioTemp + samplesRead, 1);
cblas_scopy(samplesToRead, visAudio + readCursor, 1, outPCM + samplesRead, 1);
samplesRead += samplesToRead;
readCursor += samplesToRead;
samples -= samplesToRead;
if(readCursor >= visAudioSize) readCursor -= visAudioSize;
}
}
if(outPCM) {
cblas_scopy(4096, visAudioTemp, 1, outPCM, 1);
}
if(outFFT) {
fft_calculate(visAudioTemp, outFFT, 2048);
fft_calculate(outPCM, outFFT, 2048);
}
free(visAudioTemp);
}
@end

View file

@@ -8,48 +8,27 @@
import Foundation
@objc(VisualizationController)
class VisualizationController : NSObject {
class VisualizationController {
var serialQueue = DispatchQueue(label: "Visualization Queue")
var sampleRate = 0.0
var sampleRate = 44100.0
var latency = 0.0
var visAudio: [Float] = Array(repeating: 0.0, count: 44100 * 45)
var visAudioCursor = 0
var visAudioSize = 0
var visSamplesPosted: UInt64 = 0
private static var sharedVisualizationController: VisualizationController = {
private static var sharedController: VisualizationController = {
let visualizationController = VisualizationController()
return visualizationController
}()
@objc
class func sharedController() -> VisualizationController {
return sharedVisualizationController
class func sharedVisualizationController() -> VisualizationController {
return sharedController
}
@objc
func reset() {
serialQueue.sync {
self.latency = 0;
let amount = self.visAudioSize
for i in 0..<amount {
self.visAudio[i] = 0
}
self.visSamplesPosted = 0;
}
}
@objc
func postLatency(_ latency: Double) {
self.latency = latency
}
@objc
func samplesPosted() -> UInt64 {
return self.visSamplesPosted
}
@objc
func postSampleRate(_ sampleRate: Double) {
serialQueue.sync {
if(self.sampleRate != sampleRate) {
@@ -61,75 +40,57 @@ class VisualizationController : NSObject {
}
}
@objc
func postVisPCM(_ inPCM: UnsafePointer<Float>?, amount: Int) {
serialQueue.sync {
let bufferPointer = UnsafeBufferPointer<Float>(start: inPCM, count: amount)
let bufferPointer = UnsafeBufferPointer(start: inPCM, count: amount)
if let bptr = bufferPointer {
let dataArray = bptr.assumingMemoryBound(to: Float.self)
var j = self.visAudioCursor
let k = self.visAudioSize
if(j + amount <= k) {
let endIndex = j + amount;
self.visAudio.replaceSubrange(j..<endIndex, with: bufferPointer)
j += amount
if(j >= k) { j = 0 }
} else {
let inEndIndex = k - j
let remainder = amount - inEndIndex
self.visAudio.replaceSubrange(j..<k, with: bufferPointer.prefix(inEndIndex))
self.visAudio.replaceSubrange(0..<remainder, with: bufferPointer.suffix(remainder))
j = remainder
var k = self.visAudioSize
for i in 0..<amount {
let x = Float(dataArray[i])
self.visAudio[j] = x
j++; if j >= k { j = 0 }
}
self.visAudioCursor = j
self.visSamplesPosted += UInt64(amount);
}
}
}
@objc
func readSampleRate() -> Double {
serialQueue.sync {
return self.sampleRate
}
}
@objc
func copyVisPCM(_ outPCM: UnsafeMutablePointer<Float>?, visFFT: UnsafeMutablePointer<Float>?, latencyOffset: Double) {
if(self.visAudioSize == 0) {
outPCM?.update(repeating: 0.0, count: 4096)
visFFT?.update(repeating: 0.0, count: 2048)
return
}
var outPCMCopy = Array<Float>(repeating: 0.0, count: 4096)
func copyVisPCM(_ outPCM: UnsafeMutablePointer<Float>?, visFFT: UnsafeMutablePointer<Float>?, latencyoffset: Double) {
let outPCMCopy = Array<Float>(repeating: 0.0, count: 4096)
serialQueue.sync {
// Offset latency so the target sample is in the center of the window
let latencySamples = (Int)((self.latency + latencyOffset) * self.sampleRate) + 2048
var samplesToDo = 4096;
if(latencySamples < 0) {
return;
}
if(latencySamples < 4096) {
// Latency can sometimes dip below this threshold
samplesToDo = latencySamples;
}
var latencySamples = (Int)(self.latency * self.sampleRate)
var j = self.visAudioCursor - latencySamples
let k = self.visAudioSize
var k = self.visAudioSize
if j < 0 { j += k }
if(j + samplesToDo <= k) {
outPCMCopy.replaceSubrange(0..<samplesToDo, with: self.visAudio.suffix(from: j).prefix(samplesToDo))
} else {
let outEndIndex = k - j
let remainder = samplesToDo - outEndIndex
outPCMCopy.replaceSubrange(0..<outEndIndex, with: self.visAudio.suffix(from: j))
outPCMCopy.replaceSubrange(outEndIndex..<samplesToDo, with: self.visAudio.prefix(remainder))
for i in 0..4095 {
let x = self.visAudio[j]
outPCMCopy[i] = x
j++; if j >= k { j = 0 }
}
}
outPCM?.update(from: outPCMCopy, count: 4096)
let pcmPointer = UnsafeMutableBufferPointer(start: outPCM, count: 4096)
if let bptr = pcmPointer {
let dataArray = bptr.assumingMemoryBound(to: Float.self)
for i in 0..4095 {
let x = outPCMCopy[i]
dataArray[i] = x
}
}
if(visFFT != nil) {
let fftPointer = UnsafeMutablePointer(start: visFFT, count: 2048)
if let bptr = fftPointer {
serialQueue.sync {
fft_calculate(outPCMCopy, visFFT, 2048)
fft_calculate(outPCMCopy, bptr, 2048)
}
}
}

View file

@@ -1,8 +1,8 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.Cocoa.XIB" version="3.0" toolsVersion="20037" targetRuntime="MacOSX.Cocoa" propertyAccessControl="none" useAutolayout="YES" customObjectInstantitationMethod="direct">
<document type="com.apple.InterfaceBuilder3.Cocoa.XIB" version="3.0" toolsVersion="19529" targetRuntime="MacOSX.Cocoa" propertyAccessControl="none" useAutolayout="YES" customObjectInstantitationMethod="direct">
<dependencies>
<deployment identifier="macosx"/>
<plugIn identifier="com.apple.InterfaceBuilder.CocoaPlugin" version="20037"/>
<plugIn identifier="com.apple.InterfaceBuilder.CocoaPlugin" version="19529"/>
<capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
</dependencies>
<objects>
@@ -853,7 +853,7 @@
</connections>
</slider>
<textField horizontalHuggingPriority="251" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="sTm-FX-lSj">
<rect key="frame" x="0.0" y="9" width="701" height="16"/>
<rect key="frame" x="172" y="9" width="356" height="16"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" lineBreakMode="clipping" alignment="center" title="Note: You may use right-click to draw an equalizer shape" id="lwG-Tm-rr1">
<font key="font" usesAppearanceFont="YES"/>

View file

@@ -1,62 +1,33 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.Cocoa.XIB" version="3.0" toolsVersion="23504" targetRuntime="MacOSX.Cocoa" propertyAccessControl="none" useAutolayout="YES" customObjectInstantitationMethod="direct">
<document type="com.apple.InterfaceBuilder3.Cocoa.XIB" version="3.0" toolsVersion="15505" targetRuntime="MacOSX.Cocoa" propertyAccessControl="none" useAutolayout="YES">
<dependencies>
<deployment identifier="macosx"/>
<plugIn identifier="com.apple.InterfaceBuilder.CocoaPlugin" version="23504"/>
<plugIn identifier="com.apple.InterfaceBuilder.CocoaPlugin" version="15505"/>
<capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
</dependencies>
<objects>
<customObject id="-2" userLabel="File's Owner" customClass="FeedbackController">
<connections>
<outlet property="emailView" destination="10" id="X4O-Qx-zUq"/>
<outlet property="messageView" destination="15" id="gwe-Bb-ZEz"/>
<outlet property="nameView" destination="5" id="9Uz-Hh-EVf"/>
<outlet property="fromView" destination="5" id="37"/>
<outlet property="messageView" destination="15" id="38"/>
<outlet property="sendingIndicator" destination="12" id="39"/>
<outlet property="subjectView" destination="10" id="36"/>
<outlet property="window" destination="1" id="42"/>
</connections>
</customObject>
<customObject id="-1" userLabel="First Responder" customClass="FirstResponder"/>
<customObject id="-3" userLabel="Application" customClass="NSObject"/>
<window title="Send Crash Feedback" allowsToolTipsWhenApplicationIsInactive="NO" autorecalculatesKeyViewLoop="NO" releasedWhenClosed="NO" visibleAtLaunch="NO" animationBehavior="default" id="1" userLabel="FeedbackWindow">
<window title="Send Feedback" allowsToolTipsWhenApplicationIsInactive="NO" autorecalculatesKeyViewLoop="NO" releasedWhenClosed="NO" visibleAtLaunch="NO" animationBehavior="default" id="1" userLabel="FeedbackWindow">
<windowStyleMask key="styleMask" titled="YES" closable="YES"/>
<windowPositionMask key="initialPositionMask" leftStrut="YES" rightStrut="YES" topStrut="YES" bottomStrut="YES"/>
<rect key="contentRect" x="168" y="357" width="480" height="376"/>
<rect key="screenRect" x="0.0" y="0.0" width="1920" height="1055"/>
<rect key="screenRect" x="0.0" y="0.0" width="1920" height="1057"/>
<value key="minSize" type="size" width="213" height="107"/>
<view key="contentView" id="3">
<rect key="frame" x="0.0" y="0.0" width="480" height="376"/>
<autoresizingMask key="autoresizingMask"/>
<subviews>
<textField focusRingType="none" verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="8" userLabel="Name:">
<rect key="frame" x="17" y="339" width="58" height="17"/>
<autoresizingMask key="autoresizingMask"/>
<textFieldCell key="cell" sendsActionOnEndEditing="YES" alignment="left" title="Name:" id="18" userLabel="Name:">
<font key="font" metaFont="system"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="5" userLabel="Name Field">
<rect key="frame" x="80" y="337" width="356" height="22"/>
<autoresizingMask key="autoresizingMask"/>
<textFieldCell key="cell" scrollable="YES" lineBreakMode="clipping" selectable="YES" editable="YES" sendsActionOnEndEditing="YES" state="on" borderStyle="bezel" drawsBackground="YES" id="21">
<font key="font" metaFont="system"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<connections>
<outlet property="nextKeyView" destination="10" id="CNG-sG-ab3"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="4">
<rect key="frame" x="17" y="297" width="58" height="17"/>
<autoresizingMask key="autoresizingMask"/>
<textFieldCell key="cell" sendsActionOnEndEditing="YES" title="Email:" id="22">
<font key="font" metaFont="system"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="10" userLabel="Email Field">
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="10">
<rect key="frame" x="80" y="295" width="356" height="22"/>
<autoresizingMask key="autoresizingMask"/>
<textFieldCell key="cell" scrollable="YES" lineBreakMode="clipping" selectable="YES" editable="YES" sendsActionOnEndEditing="YES" state="on" borderStyle="bezel" alignment="left" drawsBackground="YES" id="16">
@@ -65,13 +36,22 @@
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<connections>
<outlet property="nextKeyView" destination="15" id="Xi7-6Y-bG1"/>
<outlet property="nextKeyView" destination="15" id="30"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="7">
<rect key="frame" x="17" y="262" width="272" height="17"/>
<textField verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="8">
<rect key="frame" x="17" y="297" width="58" height="17"/>
<autoresizingMask key="autoresizingMask"/>
<textFieldCell key="cell" sendsActionOnEndEditing="YES" alignment="left" title="Describe what you were doing:" id="19">
<textFieldCell key="cell" sendsActionOnEndEditing="YES" alignment="left" title="Subject:" id="18">
<font key="font" metaFont="system"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="7">
<rect key="frame" x="17" y="262" width="66" height="17"/>
<autoresizingMask key="autoresizingMask"/>
<textFieldCell key="cell" sendsActionOnEndEditing="YES" alignment="left" title="Message:" id="19">
<font key="font" metaFont="system"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
@@ -80,16 +60,16 @@
<scrollView fixedFrame="YES" horizontalLineScroll="10" horizontalPageScroll="10" verticalLineScroll="10" verticalPageScroll="10" hasHorizontalScroller="NO" usesPredominantAxisScrolling="NO" translatesAutoresizingMaskIntoConstraints="NO" id="11">
<rect key="frame" x="20" y="55" width="440" height="199"/>
<autoresizingMask key="autoresizingMask"/>
<clipView key="contentView" drawsBackground="NO" id="tK9-bv-5OD">
<rect key="frame" x="1" y="1" width="423" height="197"/>
<clipView key="contentView" ambiguous="YES" drawsBackground="NO" id="tK9-bv-5OD">
<rect key="frame" x="1" y="1" width="438" height="197"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<textView importsGraphics="NO" verticallyResizable="YES" usesFontPanel="YES" findStyle="panel" continuousSpellChecking="YES" usesRuler="YES" smartInsertDelete="YES" id="15">
<rect key="frame" x="0.0" y="0.0" width="423" height="197"/>
<textView ambiguous="YES" importsGraphics="NO" verticallyResizable="YES" usesFontPanel="YES" findStyle="panel" continuousSpellChecking="YES" usesRuler="YES" smartInsertDelete="YES" id="15">
<rect key="frame" x="0.0" y="0.0" width="438" height="197"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
<size key="minSize" width="423" height="197"/>
<size key="minSize" width="438" height="197"/>
<size key="maxSize" width="863" height="10000000"/>
<color key="insertionPointColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
</textView>
@@ -100,7 +80,7 @@
<autoresizingMask key="autoresizingMask"/>
</scroller>
<scroller key="verticalScroller" wantsLayer="YES" verticalHuggingPriority="750" horizontal="NO" id="14">
<rect key="frame" x="424" y="1" width="15" height="197"/>
<rect key="frame" x="423" y="1" width="16" height="197"/>
<autoresizingMask key="autoresizingMask"/>
</scroller>
<connections>
@@ -131,6 +111,31 @@
<outlet property="nextKeyView" destination="5" id="28"/>
</connections>
</button>
<progressIndicator horizontalHuggingPriority="750" verticalHuggingPriority="750" fixedFrame="YES" maxValue="100" bezeled="NO" indeterminate="YES" controlSize="small" style="spinning" translatesAutoresizingMaskIntoConstraints="NO" id="12">
<rect key="frame" x="444" y="340" width="16" height="16"/>
<autoresizingMask key="autoresizingMask"/>
</progressIndicator>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="5">
<rect key="frame" x="80" y="337" width="356" height="22"/>
<autoresizingMask key="autoresizingMask"/>
<textFieldCell key="cell" scrollable="YES" lineBreakMode="clipping" selectable="YES" editable="YES" sendsActionOnEndEditing="YES" state="on" borderStyle="bezel" drawsBackground="YES" id="21">
<font key="font" metaFont="system"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<connections>
<outlet property="nextKeyView" destination="10" id="33"/>
</connections>
</textField>
<textField verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="4">
<rect key="frame" x="17" y="339" width="71" height="17"/>
<autoresizingMask key="autoresizingMask"/>
<textFieldCell key="cell" sendsActionOnEndEditing="YES" title="Email:" id="22">
<font key="font" metaFont="system"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
</subviews>
</view>
<connections>

View file

@@ -1,168 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.Cocoa.XIB" version="3.0" toolsVersion="22113.1" targetRuntime="MacOSX.Cocoa" propertyAccessControl="none" useAutolayout="YES" customObjectInstantitationMethod="direct">
<dependencies>
<deployment identifier="macosx"/>
<plugIn identifier="com.apple.InterfaceBuilder.CocoaPlugin" version="22113.1"/>
<capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
</dependencies>
<objects>
<customObject id="-2" userLabel="File's Owner" customClass="FileTreeViewController">
<connections>
<outlet property="fileTreeOutlineView" destination="69" id="141"/>
<outlet property="firstResponder" destination="69" id="140"/>
<outlet property="view" destination="55" id="103"/>
</connections>
</customObject>
<customObject id="-1" userLabel="First Responder" customClass="FirstResponder"/>
<customObject id="-3" userLabel="Application" customClass="NSObject"/>
<customObject id="9" userLabel="FileTreeDataSource" customClass="FileTreeDataSource">
<connections>
<outlet property="outlineView" destination="69" id="88"/>
<outlet property="pathControl" destination="65" id="109"/>
<outlet property="watcher" destination="31" id="34"/>
</connections>
</customObject>
<userDefaultsController representsSharedInstance="YES" id="27"/>
<customObject id="31" customClass="PathWatcher">
<connections>
<outlet property="delegate" destination="9" id="33"/>
</connections>
</customObject>
<customView id="55" userLabel="File Tree View">
<rect key="frame" x="0.0" y="0.0" width="300" height="400"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<pathControl focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" allowsExpansionToolTips="YES" translatesAutoresizingMaskIntoConstraints="NO" id="65">
<rect key="frame" x="76" y="374" width="224" height="26"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<pathCell key="cell" selectable="YES" enabled="NO" focusRingType="none" alignment="left" pathStyle="popUp" id="66">
<font key="font" metaFont="system"/>
<color key="backgroundColor" name="windowBackgroundColor" catalog="System" colorSpace="catalog"/>
</pathCell>
<connections>
<binding destination="27" name="value" keyPath="values.fileTreeRootURL" id="108">
<dictionary key="options">
<string key="NSValueTransformerName">StringToURLTransformer</string>
</dictionary>
</binding>
</connections>
</pathControl>
<box fixedFrame="YES" boxType="custom" borderType="line" title="Box" titlePosition="noTitle" translatesAutoresizingMaskIntoConstraints="NO" id="147">
<rect key="frame" x="0.0" y="373" width="300" height="1"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<view key="contentView" id="Dg2-ay-LZH">
<rect key="frame" x="1" y="1" width="298" height="0.0"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
</view>
<color key="borderColor" name="scrollBarColor" catalog="System" colorSpace="catalog"/>
<color key="fillColor" name="textColor" catalog="System" colorSpace="catalog"/>
<font key="titleFont" metaFont="system"/>
</box>
<scrollView fixedFrame="YES" borderType="none" autohidesScrollers="YES" horizontalLineScroll="23" horizontalPageScroll="10" verticalLineScroll="23" verticalPageScroll="10" usesPredominantAxisScrolling="NO" translatesAutoresizingMaskIntoConstraints="NO" id="64">
<rect key="frame" x="0.0" y="0.0" width="300" height="373"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<clipView key="contentView" drawsBackground="NO" copiesOnScroll="NO" id="OYe-Aa-Spw">
<rect key="frame" x="0.0" y="0.0" width="300" height="373"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<outlineView focusRingType="none" verticalHuggingPriority="750" allowsExpansionToolTips="YES" columnAutoresizingStyle="lastColumnOnly" selectionHighlightStyle="sourceList" columnReordering="NO" autosaveColumns="NO" autosaveName="FileTree" rowHeight="18" indentationPerLevel="14" autoresizesOutlineColumn="YES" outlineTableColumn="70" id="69" customClass="FileTreeOutlineView">
<rect key="frame" x="0.0" y="0.0" width="300" height="373"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<size key="intercellSpacing" width="3" height="5"/>
<color key="backgroundColor" name="_sourceListBackgroundColor" catalog="System" colorSpace="catalog"/>
<color key="gridColor" name="gridColor" catalog="System" colorSpace="catalog"/>
<tableColumns>
<tableColumn editable="NO" width="268" minWidth="16" maxWidth="1000" id="70">
<tableHeaderCell key="headerCell" lineBreakMode="truncatingTail" borderStyle="border" alignment="left">
<color key="textColor" name="headerTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" white="0.33333299" alpha="1" colorSpace="calibratedWhite"/>
</tableHeaderCell>
<textFieldCell key="dataCell" lineBreakMode="truncatingTail" selectable="YES" editable="YES" alignment="left" title="Text Cell" id="71" customClass="FileIconCell">
<font key="font" metaFont="system"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlBackgroundColor" catalog="System" colorSpace="catalog"/>
<connections>
<binding destination="27" name="fontSize" keyPath="values.fontSize" id="93"/>
</connections>
</textFieldCell>
<tableColumnResizingMask key="resizingMask" resizeWithTable="YES" userResizable="YES"/>
</tableColumn>
</tableColumns>
<connections>
<binding destination="27" name="rowHeight" keyPath="values.fontSize" id="86">
<dictionary key="options">
<string key="NSValueTransformerName">FontSizetoLineHeightTransformer</string>
</dictionary>
</binding>
<outlet property="dataSource" destination="9" id="87"/>
<outlet property="delegate" destination="94" id="98"/>
<outlet property="menu" destination="110" id="121"/>
</connections>
</outlineView>
</subviews>
<nil key="backgroundColor"/>
</clipView>
<scroller key="horizontalScroller" hidden="YES" verticalHuggingPriority="750" horizontal="YES" id="67">
<rect key="frame" x="0.0" y="362" width="306" height="15"/>
<autoresizingMask key="autoresizingMask"/>
</scroller>
<scroller key="verticalScroller" hidden="YES" verticalHuggingPriority="750" horizontal="NO" id="68">
<rect key="frame" x="261" y="0.0" width="15" height="363"/>
<autoresizingMask key="autoresizingMask"/>
</scroller>
<connections>
<outlet property="nextKeyView" destination="69" id="104"/>
</connections>
</scrollView>
<button verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="Tqu-Wl-fBM">
<rect key="frame" x="-2" y="370" width="81" height="32"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<buttonCell key="cell" type="push" title="Choose" bezelStyle="rounded" alignment="center" borderStyle="border" imageScaling="proportionallyDown" inset="2" id="vRe-3U-Nxj">
<behavior key="behavior" pushIn="YES" lightByBackground="YES" lightByGray="YES"/>
<font key="font" metaFont="system"/>
</buttonCell>
<connections>
<action selector="chooseRootFolder:" target="-2" id="aZ6-tK-wqz"/>
</connections>
</button>
</subviews>
<point key="canvasLocation" x="-310" y="119"/>
</customView>
<customObject id="94" customClass="FileTreeController">
<connections>
<outlet property="controller" destination="-2" id="106"/>
<outlet property="dataSource" destination="9" id="137"/>
<outlet property="outlineView" destination="69" id="95"/>
</connections>
</customObject>
<menu title="Menu" id="110" userLabel="ContextualMenu">
<items>
<menuItem title="Add to Playlist" tag="1" id="119">
<connections>
<action selector="addToPlaylist:" target="94" id="122"/>
</connections>
</menuItem>
<menuItem title="Set as Playlist" tag="2" id="129">
<connections>
<action selector="setAsPlaylist:" target="94" id="130"/>
</connections>
</menuItem>
<menuItem isSeparatorItem="YES" id="128"/>
<menuItem title="Show in Finder" tag="3" id="112">
<connections>
<action selector="showEntryInFinder:" target="94" id="123"/>
</connections>
</menuItem>
<menuItem isSeparatorItem="YES" id="126"/>
<menuItem title="Set as Root" tag="4" id="124">
<connections>
<action selector="setAsRoot:" target="94" id="125"/>
</connections>
</menuItem>
</items>
<connections>
<outlet property="delegate" destination="69" id="139"/>
</connections>
</menu>
</objects>
</document>


@@ -1,8 +1,8 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.Cocoa.XIB" version="3.0" toolsVersion="23504" targetRuntime="MacOSX.Cocoa" propertyAccessControl="none" useAutolayout="YES" customObjectInstantitationMethod="direct">
<document type="com.apple.InterfaceBuilder3.Cocoa.XIB" version="3.0" toolsVersion="20037" targetRuntime="MacOSX.Cocoa" propertyAccessControl="none" useAutolayout="YES">
<dependencies>
<deployment identifier="macosx"/>
<plugIn identifier="com.apple.InterfaceBuilder.CocoaPlugin" version="23504"/>
<plugIn identifier="com.apple.InterfaceBuilder.CocoaPlugin" version="20037"/>
<capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
</dependencies>
<objects>
@@ -15,37 +15,16 @@
<customObject id="-3" userLabel="Application" customClass="NSObject"/>
<window title="Info Inspector" separatorStyle="none" allowsToolTipsWhenApplicationIsInactive="NO" autorecalculatesKeyViewLoop="NO" hidesOnDeactivate="YES" visibleAtLaunch="NO" animationBehavior="utilityWindow" frameAutosaveName="InfoInspector" titlebarAppearsTransparent="YES" id="1" customClass="NSPanel">
<windowStyleMask key="styleMask" titled="YES" closable="YES" resizable="YES" utility="YES" nonactivatingPanel="YES" HUD="YES"/>
<rect key="contentRect" x="700" y="80" width="300" height="604"/>
<rect key="contentRect" x="700" y="80" width="300" height="582"/>
<rect key="screenRect" x="0.0" y="0.0" width="1920" height="1055"/>
<value key="minSize" type="size" width="240" height="594"/>
<value key="maxSize" type="size" width="400" height="622"/>
<value key="minSize" type="size" width="240" height="550"/>
<value key="maxSize" type="size" width="400" height="600"/>
<view key="contentView" id="2">
<rect key="frame" x="0.0" y="0.0" width="300" height="604"/>
<rect key="frame" x="0.0" y="0.0" width="300" height="582"/>
<autoresizingMask key="autoresizingMask"/>
<subviews>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="vB6-9J-5qg">
<rect key="frame" x="-2" y="592" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Album Artist:" id="LFJ-QQ-gGr">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="cj0-Tw-xpq" customClass="ToolTipTextField">
<rect key="frame" x="113" y="592" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="B8w-o8-ZBw">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<connections>
<binding destination="-2" name="value" keyPath="valueToDisplay.albumartist" id="gTS-bf-rHT"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="9">
<rect key="frame" x="-2" y="570" width="107" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="9">
<rect key="frame" x="60" y="526" width="45" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Artist:" id="10">
<font key="font" metaFont="smallSystem"/>
@@ -53,8 +32,98 @@
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="33" customClass="ToolTipTextField">
<rect key="frame" x="113" y="570" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="11">
<rect key="frame" x="62" y="504" width="43" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Album:" id="12">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="13">
<rect key="frame" x="67" y="460" width="38" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Track:" id="14">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="15">
<rect key="frame" x="60" y="438" width="45" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Length:" id="16">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="17">
<rect key="frame" x="73" y="416" width="32" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Year:" id="18">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="19">
<rect key="frame" x="65" y="394" width="40" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Genre:" id="20">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="21">
<rect key="frame" x="32" y="350" width="73" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Sample Rate:" id="22">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="27">
<rect key="frame" x="48" y="328" width="57" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Channels:" id="28">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="29">
<rect key="frame" x="63" y="306" width="42" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Bitrate:" id="32">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="30">
<rect key="frame" x="15" y="284" width="90" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Bits Per Sample:" id="31">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="23">
<rect key="frame" x="73" y="482" width="32" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Title:" id="24">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="33" customClass="ToolTipTextField">
<rect key="frame" x="113" y="526" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="34">
<font key="font" metaFont="smallSystem"/>
@@ -65,38 +134,8 @@
<binding destination="-2" name="value" keyPath="valueToDisplay.artist" id="108"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="fDr-Nh-3YI">
<rect key="frame" x="-2" y="548" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Composer:" id="z22-KP-d7B">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="WVx-cb-45q" customClass="ToolTipTextField">
<rect key="frame" x="113" y="548" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="AUa-jh-Azn">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<connections>
<binding destination="-2" name="value" keyPath="valueToDisplay.composer" id="AKm-26-c5F"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="11">
<rect key="frame" x="-2" y="526" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Album:" id="12">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="35" customClass="ToolTipTextField">
<rect key="frame" x="113" y="526" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="35" customClass="ToolTipTextField">
<rect key="frame" x="113" y="504" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="36">
<font key="font" metaFont="smallSystem"/>
@@ -107,17 +146,8 @@
<binding destination="-2" name="value" keyPath="valueToDisplay.album" id="109"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="23">
<rect key="frame" x="-2" y="504" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Title:" id="24">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="37" customClass="ToolTipTextField">
<rect key="frame" x="113" y="504" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="37" customClass="ToolTipTextField">
<rect key="frame" x="113" y="482" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="38">
<font key="font" metaFont="smallSystem"/>
@@ -128,17 +158,8 @@
<binding destination="-2" name="value" keyPath="valueToDisplay.title" id="110"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="13">
<rect key="frame" x="-2" y="482" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Track:" id="14">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="39" customClass="ToolTipTextField">
<rect key="frame" x="113" y="482" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="39" customClass="ToolTipTextField">
<rect key="frame" x="113" y="460" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="40">
<font key="font" metaFont="smallSystem"/>
@@ -149,17 +170,8 @@
<binding destination="-2" name="value" keyPath="valueToDisplay.trackText" id="ZO2-Cd-dfh"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="15">
<rect key="frame" x="-2" y="460" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Length:" id="16">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="41" customClass="ToolTipTextField">
<rect key="frame" x="113" y="460" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="41" customClass="ToolTipTextField">
<rect key="frame" x="113" y="438" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="42">
<font key="font" metaFont="smallSystem"/>
@@ -167,21 +179,11 @@
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<connections>
<binding destination="-2" name="toolTip" keyPath="valueToDisplay.lengthInfo" id="26q-iJ-ecn"/>
<binding destination="-2" name="value" keyPath="valueToDisplay.lengthText" id="ji7-tL-8rb"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="17">
<rect key="frame" x="-2" y="438" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Date:" id="18">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="43" customClass="ToolTipTextField">
<rect key="frame" x="113" y="438" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="43" customClass="ToolTipTextField">
<rect key="frame" x="113" y="416" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="44">
<font key="font" metaFont="smallSystem"/>
@@ -189,20 +191,11 @@
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<connections>
<binding destination="-2" name="value" keyPath="valueToDisplay.date" id="EUq-4E-Lz3"/>
<binding destination="-2" name="value" keyPath="valueToDisplay.yearText" id="miZ-gp-CqU"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="19">
<rect key="frame" x="-2" y="416" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Genre:" id="20">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="45" customClass="ToolTipTextField">
<rect key="frame" x="113" y="416" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="45" customClass="ToolTipTextField">
<rect key="frame" x="113" y="394" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="46">
<font key="font" metaFont="smallSystem"/>
@@ -213,38 +206,8 @@
<binding destination="-2" name="value" keyPath="valueToDisplay.genre" id="114"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="84">
<rect key="frame" x="-2" y="394" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Filename:" id="85">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="86" customClass="ToolTipTextField">
<rect key="frame" x="113" y="394" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="87">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<connections>
<binding destination="-2" name="value" keyPath="valueToDisplay.filename" id="115"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="21">
<rect key="frame" x="-2" y="372" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Sample Rate:" id="22">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="49" customClass="ToolTipTextField">
<rect key="frame" x="113" y="372" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="49" customClass="ToolTipTextField">
<rect key="frame" x="113" y="350" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="50">
<font key="font" metaFont="smallSystem"/>
@@ -259,17 +222,8 @@
</binding>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="27">
<rect key="frame" x="-2" y="350" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Channels:" id="28">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="51" customClass="ToolTipTextField">
<rect key="frame" x="113" y="350" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="51" customClass="ToolTipTextField">
<rect key="frame" x="113" y="328" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="52">
<font key="font" metaFont="smallSystem"/>
@@ -280,17 +234,8 @@
<binding destination="-2" name="value" keyPath="valueToDisplay.channels" id="117"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="29">
<rect key="frame" x="-2" y="328" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Bitrate:" id="32">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="53" customClass="ToolTipTextField">
<rect key="frame" x="113" y="328" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="53" customClass="ToolTipTextField">
<rect key="frame" x="113" y="306" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="54">
<font key="font" metaFont="smallSystem"/>
@@ -301,17 +246,8 @@
<binding destination="-2" name="value" keyPath="valueToDisplay.bitrate" id="118"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="30">
<rect key="frame" x="-2" y="306" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Bits Per Sample:" id="31">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="55" customClass="ToolTipTextField">
<rect key="frame" x="113" y="306" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="55" customClass="ToolTipTextField">
<rect key="frame" x="113" y="284" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="56">
<font key="font" metaFont="smallSystem"/>
@@ -322,8 +258,8 @@
<binding destination="-2" name="value" keyPath="valueToDisplay.bitsPerSample" id="122"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="QPg-Mb-Urn">
<rect key="frame" x="-2" y="284" width="107" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="QPg-Mb-Urn">
<rect key="frame" x="60" y="262" width="45" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Codec:" id="cbq-TT-CZX">
<font key="font" metaFont="smallSystem"/>
@@ -331,8 +267,8 @@
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="ijS-y2-eCZ" customClass="ToolTipTextField">
<rect key="frame" x="113" y="284" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="ijS-y2-eCZ" customClass="ToolTipTextField">
<rect key="frame" x="113" y="262" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="Yby-OU-cqP">
<font key="font" metaFont="smallSystem"/>
@@ -343,8 +279,8 @@
<binding destination="-2" name="value" keyPath="valueToDisplay.codec" id="Tle-Vx-BN5"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="bti-s6-SIU">
<rect key="frame" x="-2" y="262" width="107" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="bti-s6-SIU">
<rect key="frame" x="38" y="240" width="67" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Encoding:" id="8e7-lp-K5l">
<font key="font" metaFont="smallSystem"/>
@@ -352,8 +288,8 @@
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="L4f-rE-CN3" customClass="ToolTipTextField">
<rect key="frame" x="113" y="262" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="L4f-rE-CN3" customClass="ToolTipTextField">
<rect key="frame" x="113" y="240" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="v14-AG-B9D">
<font key="font" metaFont="smallSystem"/>
@@ -364,8 +300,8 @@
<binding destination="-2" name="value" keyPath="valueToDisplay.encoding" id="BWi-9r-yOR"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="gn9-9b-eV8">
<rect key="frame" x="-2" y="240" width="107" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="gn9-9b-eV8">
<rect key="frame" x="15" y="218" width="90" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Cuesheet:" id="Cu5-ia-Z5b">
<font key="font" metaFont="smallSystem"/>
@@ -373,8 +309,8 @@
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="WOl-SC-4tu" customClass="ToolTipTextField">
<rect key="frame" x="113" y="240" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="WOl-SC-4tu" customClass="ToolTipTextField">
<rect key="frame" x="113" y="218" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="UyE-Mc-e39">
<font key="font" metaFont="smallSystem"/>
@@ -385,8 +321,8 @@
<binding destination="-2" name="value" keyPath="valueToDisplay.cuesheetPresent" id="nPu-MP-igV"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="arA-Pj-ANg">
<rect key="frame" x="-2" y="218" width="107" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="arA-Pj-ANg">
<rect key="frame" x="15" y="196" width="90" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="ReplayGain:" id="qBi-M8-kEx">
<font key="font" metaFont="smallSystem"/>
@@ -394,8 +330,8 @@
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="2xx-It-i6I" customClass="ToolTipTextField">
<rect key="frame" x="113" y="218" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="2xx-It-i6I" customClass="ToolTipTextField">
<rect key="frame" x="113" y="196" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="rAo-AP-lVa">
<font key="font" metaFont="smallSystem"/>
@@ -407,8 +343,50 @@
<binding destination="-2" name="toolTip" keyPath="valueToDisplay.gainInfo" id="BUp-Hu-Szq"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="FWx-Sc-Ocv">
<rect key="frame" x="-2" y="196" width="107" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="84">
<rect key="frame" x="49" y="372" width="56" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Filename:" id="85">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="86" customClass="ToolTipTextField">
<rect key="frame" x="113" y="372" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="87">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<connections>
<binding destination="-2" name="value" keyPath="valueToDisplay.filename" id="115"/>
</connections>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="vB6-9J-5qg">
<rect key="frame" x="18" y="548" width="87" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Album Artist:" id="LFJ-QQ-gGr">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="cj0-Tw-xpq" customClass="ToolTipTextField">
<rect key="frame" x="113" y="548" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="B8w-o8-ZBw">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<connections>
<binding destination="-2" name="value" keyPath="valueToDisplay.albumartist" id="gTS-bf-rHT"/>
</connections>
</textField>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="FWx-Sc-Ocv">
<rect key="frame" x="15" y="174" width="90" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Play Count:" id="fiv-eh-w3c">
<font key="font" metaFont="smallSystem"/>
@@ -416,8 +394,8 @@
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="WSs-wC-mWc" customClass="ToolTipTextField">
<rect key="frame" x="113" y="196" width="170" height="14"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="WSs-wC-mWc" customClass="ToolTipTextField">
<rect key="frame" x="113" y="174" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="Ial-XI-91y">
<font key="font" metaFont="smallSystem"/>
@@ -429,28 +407,6 @@
<binding destination="-2" name="toolTip" keyPath="valueToDisplay.playCountInfo" id="ydF-ec-fBX"/>
</connections>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="cd3-Qt-hCm">
<rect key="frame" x="-2" y="174" width="107" height="14"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" scrollable="YES" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Comment:" id="Ule-N3-dKW">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="Ef3-yG-qT1" customClass="ToolTipTextField">
<rect key="frame" x="113" y="174" width="170" height="14"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<textFieldCell key="cell" controlSize="small" lineBreakMode="truncatingMiddle" selectable="YES" sendsActionOnEndEditing="YES" title="N/A" id="PPV-dt-9Bp">
<font key="font" metaFont="smallSystem"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<connections>
<binding destination="-2" name="value" keyPath="valueToDisplay.comment" id="Esa-PB-mpv"/>
<binding destination="-2" name="toolTip" keyPath="valueToDisplay.comment" id="XzQ-nd-jMU"/>
</connections>
</textField>
<imageView horizontalHuggingPriority="251" verticalHuggingPriority="251" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="RWn-fb-0wT">
<rect key="frame" x="0.0" y="20" width="300" height="146"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMaxX="YES" flexibleMinY="YES" heightSizable="YES"/>


@@ -1,66 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.Cocoa.XIB" version="3.0" toolsVersion="22113.1" targetRuntime="MacOSX.Cocoa" propertyAccessControl="none" useAutolayout="YES" customObjectInstantitationMethod="direct">
<dependencies>
<deployment identifier="macosx"/>
<plugIn identifier="com.apple.InterfaceBuilder.CocoaPlugin" version="22113.1"/>
<capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
</dependencies>
<objects>
<customObject id="-2" userLabel="File's Owner" customClass="LyricsWindowController">
<connections>
<outlet property="window" destination="QvC-M9-y7g" id="eSJ-Nv-PBE"/>
</connections>
</customObject>
<customObject id="-1" userLabel="First Responder" customClass="FirstResponder"/>
<customObject id="-3" userLabel="Application" customClass="NSObject"/>
<window title="Lyrics" allowsToolTipsWhenApplicationIsInactive="NO" autorecalculatesKeyViewLoop="NO" visibleAtLaunch="NO" frameAutosaveName="LyricsWindow" animationBehavior="default" id="QvC-M9-y7g">
<windowStyleMask key="styleMask" titled="YES" closable="YES" miniaturizable="YES" resizable="YES"/>
<windowPositionMask key="initialPositionMask" leftStrut="YES" rightStrut="YES" topStrut="YES" bottomStrut="YES"/>
<rect key="contentRect" x="196" y="240" width="480" height="270"/>
<rect key="screenRect" x="0.0" y="0.0" width="1920" height="1055"/>
<view key="contentView" wantsLayer="YES" id="EiT-Mj-1SZ">
<rect key="frame" x="0.0" y="0.0" width="480" height="270"/>
<autoresizingMask key="autoresizingMask"/>
<subviews>
<scrollView borderType="none" horizontalLineScroll="10" horizontalPageScroll="10" verticalLineScroll="10" verticalPageScroll="10" hasHorizontalScroller="NO" id="O8B-8Z-Mxc">
<rect key="frame" x="0.0" y="0.0" width="480" height="270"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<clipView key="contentView" drawsBackground="NO" id="O6S-QV-ThM">
<rect key="frame" x="0.0" y="0.0" width="480" height="270"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<textView wantsLayer="YES" editable="NO" importsGraphics="NO" richText="NO" verticallyResizable="YES" findStyle="bar" incrementalSearchingEnabled="YES" id="DKA-ld-0Sh">
<rect key="frame" x="0.0" y="0.0" width="480" height="270"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<color key="textColor" name="textColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
<size key="minSize" width="480" height="270"/>
<size key="maxSize" width="480" height="10000000"/>
<color key="insertionPointColor" name="textColor" catalog="System" colorSpace="catalog"/>
<connections>
<binding destination="-2" name="value" keyPath="valueToDisplay.unsyncedlyrics" id="tSj-CA-G4Q">
<dictionary key="options">
<bool key="NSAllowsEditingMultipleValuesSelection" value="NO"/>
<bool key="NSConditionallySetsEditable" value="NO"/>
</dictionary>
</binding>
</connections>
</textView>
</subviews>
</clipView>
<scroller key="horizontalScroller" hidden="YES" verticalHuggingPriority="750" horizontal="YES" id="f77-wI-xOz">
<rect key="frame" x="-100" y="-100" width="225" height="15"/>
<autoresizingMask key="autoresizingMask"/>
</scroller>
<scroller key="verticalScroller" verticalHuggingPriority="750" horizontal="NO" id="XfW-du-B6L">
<rect key="frame" x="464" y="0.0" width="16" height="270"/>
<autoresizingMask key="autoresizingMask"/>
</scroller>
</scrollView>
</subviews>
</view>
<point key="canvasLocation" x="126" y="104"/>
</window>
<userDefaultsController representsSharedInstance="YES" id="t5R-DO-d90"/>
</objects>
</document>

File diff suppressed because it is too large


@@ -1,8 +1,8 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.Cocoa.XIB" version="3.0" toolsVersion="22154" targetRuntime="MacOSX.Cocoa" propertyAccessControl="none" useAutolayout="YES" customObjectInstantitationMethod="direct">
<document type="com.apple.InterfaceBuilder3.Cocoa.XIB" version="3.0" toolsVersion="19529" targetRuntime="MacOSX.Cocoa" propertyAccessControl="none" useAutolayout="YES">
<dependencies>
<deployment identifier="macosx"/>
<plugIn identifier="com.apple.InterfaceBuilder.CocoaPlugin" version="22154"/>
<plugIn identifier="com.apple.InterfaceBuilder.CocoaPlugin" version="19529"/>
<capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
</dependencies>
<objects>
@@ -24,7 +24,7 @@
<rect key="frame" x="0.0" y="0.0" width="506" height="100"/>
<autoresizingMask key="autoresizingMask"/>
<subviews>
<button tag="1" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="8">
<button verticalHuggingPriority="750" fixedFrame="YES" tag="1" translatesAutoresizingMaskIntoConstraints="NO" id="8">
<rect key="frame" x="408" y="12" width="84" height="32"/>
<autoresizingMask key="autoresizingMask"/>
<buttonCell key="cell" type="push" title="OK" bezelStyle="rounded" alignment="center" borderStyle="border" tag="1" inset="2" id="23">
@@ -52,7 +52,7 @@ Gw
<action selector="doOpenURL:" target="-2" id="15"/>
</connections>
</button>
<comboBox focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="13">
<comboBox verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="13">
<rect key="frame" x="20" y="56" width="469" height="26"/>
<autoresizingMask key="autoresizingMask"/>
<comboBoxCell key="cell" scrollable="YES" lineBreakMode="clipping" selectable="YES" editable="YES" borderStyle="bezel" alignment="left" drawsBackground="YES" numberOfVisibleItems="5" id="25">


@@ -1,8 +1,8 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.Cocoa.XIB" version="3.0" toolsVersion="22113.1" targetRuntime="MacOSX.Cocoa" propertyAccessControl="none" useAutolayout="YES" customObjectInstantitationMethod="direct">
<document type="com.apple.InterfaceBuilder3.Cocoa.XIB" version="3.0" toolsVersion="20037" targetRuntime="MacOSX.Cocoa" propertyAccessControl="none" useAutolayout="YES">
<dependencies>
<deployment identifier="macosx"/>
<plugIn identifier="com.apple.InterfaceBuilder.CocoaPlugin" version="22113.1"/>
<plugIn identifier="com.apple.InterfaceBuilder.CocoaPlugin" version="20037"/>
<capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
</dependencies>
<objects>
@@ -45,17 +45,17 @@ DQ
<rect key="frame" x="20" y="44" width="440" height="228"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<clipView key="contentView" drawsBackground="NO" copiesOnScroll="NO" id="zfU-bI-FkO">
<rect key="frame" x="1" y="1" width="438" height="226"/>
<rect key="frame" x="1" y="1" width="423" height="226"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<tableView focusRingType="none" verticalHuggingPriority="750" allowsExpansionToolTips="YES" alternatingRowBackgroundColors="YES" autosaveName="CogSpotlightPlaylist" rowSizeStyle="automatic" headerView="25" viewBased="YES" id="28" customClass="PlaylistView">
<rect key="frame" x="0.0" y="0.0" width="438" height="203"/>
<tableView focusRingType="none" verticalHuggingPriority="750" allowsExpansionToolTips="YES" alternatingRowBackgroundColors="YES" autosaveName="CogSpotlightPlaylist" headerView="25" id="28" customClass="PlaylistView">
<rect key="frame" x="0.0" y="0.0" width="447" height="203"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<size key="intercellSpacing" width="3" height="2"/>
<color key="backgroundColor" name="controlBackgroundColor" catalog="System" colorSpace="catalog"/>
<color key="gridColor" name="gridColor" catalog="System" colorSpace="catalog"/>
<tableColumns>
<tableColumn identifier="title" editable="NO" width="126" minWidth="41" maxWidth="1000" id="36">
<tableColumn identifier="title" editable="NO" width="129" minWidth="41" maxWidth="1000" id="36">
<tableHeaderCell key="headerCell" lineBreakMode="truncatingTail" borderStyle="border" alignment="left" title="Title">
<color key="textColor" name="headerTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" white="0.33333299" alpha="1" colorSpace="calibratedWhite"/>
@@ -67,26 +67,6 @@ DQ
</textFieldCell>
<sortDescriptor key="sortDescriptorPrototype" selector="caseInsensitiveCompare:" sortKey="title"/>
<tableColumnResizingMask key="resizingMask" resizeWithTable="YES" userResizable="YES"/>
<prototypeCellViews>
<tableCellView id="vIL-tT-nsx">
<rect key="frame" x="1" y="1" width="131" height="17"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<textField focusRingType="none" horizontalHuggingPriority="251" verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="dow-05-P5d">
<rect key="frame" x="0.0" y="1" width="131" height="16"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES" flexibleMaxY="YES"/>
<textFieldCell key="cell" lineBreakMode="truncatingTail" sendsActionOnEndEditing="YES" title="Table View Cell" id="nbX-Qx-ta8">
<font key="font" usesAppearanceFont="YES"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
</subviews>
<connections>
<outlet property="textField" destination="dow-05-P5d" id="dKF-cf-DXu"/>
</connections>
</tableCellView>
</prototypeCellViews>
<connections>
<binding destination="16" name="value" keyPath="arrangedObjects.title" id="93">
<dictionary key="options">
@@ -96,7 +76,7 @@ DQ
<binding destination="186" name="fontSize" keyPath="values.fontSize" id="198"/>
</connections>
</tableColumn>
<tableColumn identifier="artist" editable="NO" width="122" minWidth="36" maxWidth="1000" id="34">
<tableColumn identifier="artist" editable="NO" width="124" minWidth="36" maxWidth="1000" id="34">
<tableHeaderCell key="headerCell" lineBreakMode="truncatingTail" borderStyle="border" alignment="left" title="Artist">
<color key="textColor" name="headerTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="headerColor" catalog="System" colorSpace="catalog"/>
@@ -108,26 +88,6 @@ DQ
</textFieldCell>
<sortDescriptor key="sortDescriptorPrototype" selector="caseInsensitiveCompare:" sortKey="artist"/>
<tableColumnResizingMask key="resizingMask" resizeWithTable="YES" userResizable="YES"/>
<prototypeCellViews>
<tableCellView id="VrH-Fp-XeF">
<rect key="frame" x="135" y="1" width="122" height="17"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<textField focusRingType="none" horizontalHuggingPriority="251" verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="f5E-6Y-uFf">
<rect key="frame" x="0.0" y="1" width="122" height="16"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES" flexibleMaxY="YES"/>
<textFieldCell key="cell" lineBreakMode="truncatingTail" sendsActionOnEndEditing="YES" title="Table View Cell" id="zeT-Bx-Fer">
<font key="font" usesAppearanceFont="YES"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
</subviews>
<connections>
<outlet property="textField" destination="f5E-6Y-uFf" id="V2x-bq-F1X"/>
</connections>
</tableCellView>
</prototypeCellViews>
<connections>
<binding destination="16" name="value" keyPath="arrangedObjects.artist" id="104">
<dictionary key="options">
@@ -137,7 +97,7 @@ DQ
<binding destination="186" name="fontSize" keyPath="values.fontSize" id="199"/>
</connections>
</tableColumn>
<tableColumn identifier="album" editable="NO" width="125" minWidth="39" maxWidth="1000" id="33">
<tableColumn identifier="album" editable="NO" width="127" minWidth="39" maxWidth="1000" id="33">
<tableHeaderCell key="headerCell" lineBreakMode="truncatingTail" borderStyle="border" alignment="left" title="Album">
<color key="textColor" name="headerTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="headerColor" catalog="System" colorSpace="catalog"/>
@@ -149,26 +109,6 @@ DQ
</textFieldCell>
<sortDescriptor key="sortDescriptorPrototype" selector="caseInsensitiveCompare:" sortKey="album"/>
<tableColumnResizingMask key="resizingMask" resizeWithTable="YES" userResizable="YES"/>
<prototypeCellViews>
<tableCellView id="sZe-dt-0bF">
<rect key="frame" x="260" y="1" width="125" height="17"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<textField focusRingType="none" horizontalHuggingPriority="251" verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="wXR-zH-kaq">
<rect key="frame" x="0.0" y="1" width="125" height="16"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES" flexibleMaxY="YES"/>
<textFieldCell key="cell" lineBreakMode="truncatingTail" sendsActionOnEndEditing="YES" title="Table View Cell" id="DBh-Jn-CLe">
<font key="font" usesAppearanceFont="YES"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
</subviews>
<connections>
<outlet property="textField" destination="wXR-zH-kaq" id="zTm-AX-xhO"/>
</connections>
</tableCellView>
</prototypeCellViews>
<connections>
<binding destination="16" name="value" keyPath="arrangedObjects.album" id="101">
<dictionary key="options">
@@ -189,28 +129,8 @@ DQ
<color key="backgroundColor" name="controlBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<tableColumnResizingMask key="resizingMask" resizeWithTable="YES" userResizable="YES"/>
<prototypeCellViews>
<tableCellView id="w6e-lJ-LQP">
<rect key="frame" x="1" y="1" width="0.0" height="17"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<textField focusRingType="none" horizontalHuggingPriority="251" verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="kNC-PB-Kja">
<rect key="frame" x="0.0" y="1" width="4" height="16"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES" flexibleMaxY="YES"/>
<textFieldCell key="cell" lineBreakMode="truncatingTail" sendsActionOnEndEditing="YES" title="Table View Cell" id="Uvp-HF-CkO">
<font key="font" usesAppearanceFont="YES"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
</subviews>
<connections>
<outlet property="textField" destination="kNC-PB-Kja" id="qWQ-5p-Fw9"/>
</connections>
</tableCellView>
</prototypeCellViews>
<connections>
<binding destination="16" name="value" keyPath="arrangedObjects.length" id="4eP-Is-O9p">
<binding destination="16" name="value" keyPath="arrangedObjects.spotlightLength" id="gWF-nL-fqJ">
<dictionary key="options">
<bool key="NSConditionallySetsEditable" value="YES"/>
</dictionary>
@@ -229,26 +149,6 @@ DQ
<color key="backgroundColor" name="controlBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
<tableColumnResizingMask key="resizingMask" resizeWithTable="YES" userResizable="YES"/>
<prototypeCellViews>
<tableCellView id="1lW-Vx-WF3">
<rect key="frame" x="1" y="1" width="0.0" height="17"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<textField focusRingType="none" horizontalHuggingPriority="251" verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="acc-7f-dXh">
<rect key="frame" x="0.0" y="1" width="4" height="16"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES" flexibleMaxY="YES"/>
<textFieldCell key="cell" lineBreakMode="truncatingTail" sendsActionOnEndEditing="YES" title="Table View Cell" id="9ZK-JX-P74">
<font key="font" usesAppearanceFont="YES"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
</subviews>
<connections>
<outlet property="textField" destination="acc-7f-dXh" id="Hha-9Z-fS4"/>
</connections>
</tableCellView>
</prototypeCellViews>
<connections>
<binding destination="16" name="value" keyPath="arrangedObjects.year" id="94">
<dictionary key="options">
@@ -270,26 +170,6 @@ DQ
</textFieldCell>
<sortDescriptor key="sortDescriptorPrototype" selector="caseInsensitiveCompare:" sortKey="genre"/>
<tableColumnResizingMask key="resizingMask" resizeWithTable="YES" userResizable="YES"/>
<prototypeCellViews>
<tableCellView id="HJP-mb-r2M">
<rect key="frame" x="1" y="1" width="0.0" height="17"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<textField focusRingType="none" horizontalHuggingPriority="251" verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="5n5-AC-puO">
<rect key="frame" x="0.0" y="1" width="4" height="16"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES" flexibleMaxY="YES"/>
<textFieldCell key="cell" lineBreakMode="truncatingTail" sendsActionOnEndEditing="YES" title="Table View Cell" id="R66-n6-DSb">
<font key="font" usesAppearanceFont="YES"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
</subviews>
<connections>
<outlet property="textField" destination="5n5-AC-puO" id="Wno-mK-MDi"/>
</connections>
</tableCellView>
</prototypeCellViews>
<connections>
<binding destination="16" name="value" keyPath="arrangedObjects.genre" id="102">
<dictionary key="options">
@@ -299,7 +179,7 @@ DQ
<binding destination="186" name="fontSize" keyPath="values.fontSize" id="212"/>
</connections>
</tableColumn>
<tableColumn identifier="track" editable="NO" width="44" minWidth="8" maxWidth="46" id="29">
<tableColumn identifier="track" editable="NO" width="46" minWidth="8" maxWidth="46" id="29">
<tableHeaderCell key="headerCell" lineBreakMode="truncatingTail" borderStyle="border" alignment="right" title="Track">
<color key="textColor" name="headerTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="headerColor" catalog="System" colorSpace="catalog"/>
@@ -311,26 +191,6 @@ DQ
</textFieldCell>
<sortDescriptor key="sortDescriptorPrototype" selector="compareTrackNumbers:" sortKey="trackText"/>
<tableColumnResizingMask key="resizingMask" resizeWithTable="YES" userResizable="YES"/>
<prototypeCellViews>
<tableCellView id="AdY-0L-7wP">
<rect key="frame" x="388" y="1" width="48" height="17"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<textField focusRingType="none" horizontalHuggingPriority="251" verticalHuggingPriority="750" horizontalCompressionResistancePriority="250" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="6ak-Iu-0Dr">
<rect key="frame" x="0.0" y="1" width="48" height="16"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES" flexibleMaxY="YES"/>
<textFieldCell key="cell" lineBreakMode="truncatingTail" sendsActionOnEndEditing="YES" title="Table View Cell" id="5fO-3i-lEM">
<font key="font" usesAppearanceFont="YES"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="textBackgroundColor" catalog="System" colorSpace="catalog"/>
</textFieldCell>
</textField>
</subviews>
<connections>
<outlet property="textField" destination="6ak-Iu-0Dr" id="Afl-RA-rk1"/>
</connections>
</tableCellView>
</prototypeCellViews>
<connections>
<binding destination="16" name="value" keyPath="arrangedObjects.trackText" id="VFy-Rw-QP2">
<dictionary key="options">
@@ -349,7 +209,6 @@ DQ
</dictionary>
</binding>
<outlet property="dataSource" destination="16" id="151"/>
<outlet property="delegate" destination="16" id="LPj-YJ-Alq"/>
<outlet property="menu" destination="171" id="176"/>
<outlet property="playlistController" destination="16" id="184"/>
</connections>
@@ -362,18 +221,18 @@ DQ
<autoresizingMask key="autoresizingMask"/>
</scroller>
<scroller key="verticalScroller" verticalHuggingPriority="750" doubleValue="1" horizontal="NO" id="27">
<rect key="frame" x="423" y="24" width="16" height="203"/>
<rect key="frame" x="424" y="24" width="15" height="203"/>
<autoresizingMask key="autoresizingMask"/>
</scroller>
<tableHeaderView key="headerView" wantsLayer="YES" id="25">
<rect key="frame" x="0.0" y="0.0" width="438" height="23"/>
<rect key="frame" x="0.0" y="0.0" width="447" height="23"/>
<autoresizingMask key="autoresizingMask"/>
</tableHeaderView>
<connections>
<outlet property="nextKeyView" destination="88" id="206"/>
</connections>
</scrollView>
<searchField wantsLayer="YES" focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="55">
<searchField wantsLayer="YES" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="55">
<rect key="frame" x="20" y="282" width="313" height="22"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" flexibleMinY="YES"/>
<searchFieldCell key="cell" scrollable="YES" lineBreakMode="clipping" selectable="YES" editable="YES" borderStyle="bezel" bezelStyle="round" id="56">
@@ -386,10 +245,10 @@ DQ
<outlet property="nextKeyView" destination="5" id="204"/>
</connections>
</searchField>
<textField focusRingType="none" verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="79">
<rect key="frame" x="18" y="13" width="264" height="17"/>
<textField verticalHuggingPriority="750" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="79">
<rect key="frame" x="174" y="13" width="108" height="17"/>
<autoresizingMask key="autoresizingMask" flexibleMinX="YES" flexibleMaxY="YES"/>
<textFieldCell key="cell" lineBreakMode="clipping" sendsActionOnEndEditing="YES" alignment="right" title="Search Location:" id="80">
<textFieldCell key="cell" lineBreakMode="clipping" sendsActionOnEndEditing="YES" title="Search Location:" id="80">
<font key="font" metaFont="system"/>
<color key="textColor" name="controlTextColor" catalog="System" colorSpace="catalog"/>
<color key="backgroundColor" name="controlColor" catalog="System" colorSpace="catalog"/>


@@ -2,10 +2,13 @@
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.temporary-exception.mach-lookup.global-name</key>
<array>
<string>$(PRODUCT_BUNDLE_IDENTIFIER)-spks</string>
<string>$(PRODUCT_BUNDLE_IDENTIFIER)-spki</string>
</array>
<key>com.apple.security.app-sandbox</key>
<true/>
<key>com.apple.security.cs.disable-library-validation</key>
<true/>
<key>com.apple.security.cs.allow-jit</key>
<true/>
<key>com.apple.security.files.user-selected.read-write</key>
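
This entitlements hunk covers the two mach-lookup temporary exceptions ($(PRODUCT_BUNDLE_IDENTIFIER)-spks and -spki, which match the service names Sparkle's sandboxed updater registers, so they are presumably for the in-app updater) together with the app-sandbox, library-validation, JIT, and user-selected read-write keys. With user-selected read-write access, a sandboxed app typically persists the folders the user grants as security-scoped bookmarks; a minimal sketch, assuming the bookmark entitlements are also present and with a defaults key chosen only for illustration:

import Foundation

// Hedged sketch: persisting and restoring a user-granted URL as a
// security-scoped bookmark. The defaults key is a placeholder, not
// Cog's real bookmark storage.
enum BookmarkStore {
    static let key = "example.savedBookmark"

    static func save(_ url: URL) throws {
        let data = try url.bookmarkData(options: .withSecurityScope,
                                        includingResourceValuesForKeys: nil,
                                        relativeTo: nil)
        UserDefaults.standard.set(data, forKey: key)
    }

    static func restore() -> URL? {
        guard let data = UserDefaults.standard.data(forKey: key) else { return nil }
        var stale = false
        guard let url = try? URL(resolvingBookmarkData: data,
                                 options: .withSecurityScope,
                                 relativeTo: nil,
                                 bookmarkDataIsStale: &stale),
              url.startAccessingSecurityScopedResource() else { return nil }
        // Caller is responsible for stopAccessingSecurityScopedResource().
        return url
    }
}
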


@@ -0,0 +1,36 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>HPDBookType</key>
<string>3</string>
<key>HPDBookTitle</key>
<string>Cog Help</string>
<key>HPDBookKBURL</key>
<string>https://bitbucket.org/kode54/cog</string>
<key>HPDBookKBProduct</key>
<string>cog1</string>
<key>HPDBookIndexPath</key>
<string>Cog.helpindex</string>
<key>HPDBookIconPath</key>
<string>shrd/icon.png</string>
<key>HPDBookAccessPath</key>
<string>Cog.html</string>
<key>CFBundleVersion</key>
<string>1</string>
<key>CFBundleSignature</key>
<string>hbwr</string>
<key>CFBundleShortVersionString</key>
<string>1</string>
<key>CFBundlePackageType</key>
<string>BNDL</string>
<key>CFBundleName</key>
<string>Cog</string>
<key>CFBundleInfoDictionaryVersion</key>
<string>6.0</string>
<key>CFBundleIdentifier</key>
<string>org.cogx.cog.help</string>
<key>CFBundleDevelopmentRegion</key>
<string>en_us</string>
</dict>
</plist>
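
The new plist above is the Apple Help book descriptor for the bundled Cog Help (access page Cog.html, index Cog.helpindex, bundle identifier org.cogx.cog.help). Assuming the main app's Info.plist points at this bundle via CFBundleHelpBookFolder and CFBundleHelpBookName (not shown in this diff), opening the book from code is a one-liner; the class and method below are placeholders:

import AppKit

// Hypothetical sketch: with the help book registered in the app's
// Info.plist, the standard Help menu item (or any custom action like
// this one) opens it through NSApplication.
final class HelpOpener: NSObject {
    @objc func openCogHelp(_ sender: Any?) {
        NSApplication.shared.showHelp(sender)
    }
}
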

Binary file not shown.

Some files were not shown because too many files have changed in this diff.