Arturia Forums

DRUMS => Spark => Spark Technical Issues => Topic started by: Lovesound on May 06, 2013, 05:43:47 pm

Title: Spark LE - Using two instances (or more) in Logic ?!
Post by: Lovesound on May 06, 2013, 05:43:47 pm
Hi there,
I'm still learning to use Spark LE in my Logic/studio setup.
I need to run two (or more) instances of Spark in Logic with the one controller, but the Spark controller seems to be internally wired to both instances and triggers them ALL the time, even when the track is not selected in the Logic Arrange window. (For example, if I select Omnisphere, the pads still trigger the Spark tracks AND the Omnisphere track!?) I have Spark LE and have created a "Spark1 Midi Out" and disconnected the Spark LE port (see the picture of the Logic Environment). The Spark LE pads need to send MIDI into Logic, since I record my patterns in Logic (not Spark); when I connected the other port, I always got the issue where pads 9-16 also trigger pads 1-8. Connecting it to the MIDI Click (as suggested in a different thread) didn't help either, since I need to connect it to the Sequencer Input to send MIDI and record the pattern.

Can I somehow set up Spark LE so that it only triggers the Spark software when that track is selected in the Arrange window (like any other controller)?

I hope I've made myself clear...

Regards,
Sebastian
Title: Re: Spark LE - Using two instances (or more) in Logic ?!
Post by: Kevin on May 14, 2013, 11:28:32 am
Hi Sebastian,
I did not see your post because it was in the wrong board.
Did you manage to solve your problem?

Kevin
Title: Re: Spark LE - Using two instances (or more) in Logic ?!
Post by: Lovesound on May 14, 2013, 07:22:52 pm
Hi Kevin,
Unfortunately not. I would really appreciate it if you could give me some advice. I noticed that if you load several instances, new MIDI ports are created, but MIDI 1, the first instance, is still always the one triggered, even when it is not selected. Really strange.

Regards,
Sebastian
Title: Re: Spark LE - Using two instances (or more) in Logic ?!
Post by: Kevin on May 15, 2013, 08:15:29 am
The only way to do that is to manually disconnect the Spark controller from a Spark plug-in instance with the Connect/Disconnect button.