This question comes from a programmer's perspective (so it might belong on Stack Overflow; please migrate if so), but I have an EE background.
I am trying to programmatically reduce the Bluetooth power consumption between a cellphone (a BlackBerry Z10, probably featuring a Texas Instruments WL1273L wireless module) and a Pebble smartwatch (probably featuring a Texas Instruments CC2564: http://www.kynix.com/Detail/729375/CC2564.html).
Apple provides some best practices in which they suggest (see the last chapter) to “Disconnect from a Device When You No Longer Need It”. Even then, though, Bluetooth stays enabled and draws power.

On the other hand, the connection phase seems quite power-intensive, so the question arises whether keeping a connection open (without sending data) is better than disconnecting and reconnecting whenever data is available. Are the underlying layers comparable to this TCP scenario?
Is there a sweet spot depending on the rate of new data?
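To make the sweet-spot question concrete, here is the back-of-envelope energy model I have in mind. All numbers are invented placeholders (I have no way to measure them, as noted below), and `IDLE_CONNECTED_MW` / `RECONNECT_MJ` are my own assumed parameters, not datasheet values:

```python
# Illustrative model: hold the link vs. reconnect per data event.
# Both constants are ASSUMED placeholder values, not measurements.

IDLE_CONNECTED_MW = 1.0  # assumed average power of an idle (e.g. sniff-mode) link, in mW
RECONNECT_MJ = 50.0      # assumed energy cost of one page/link-setup/teardown cycle, in mJ

def energy_per_event_mj(interval_s: float, keep_alive: bool) -> float:
    """Energy attributable to one data event arriving every interval_s seconds."""
    if keep_alive:
        # Idle power integrated over the wait between events: mW * s = mJ.
        return IDLE_CONNECTED_MW * interval_s
    # Reconnect strategy pays a fixed setup cost per event.
    return RECONNECT_MJ

def breakeven_interval_s() -> float:
    """Event interval at which both strategies cost the same energy."""
    return RECONNECT_MJ / IDLE_CONNECTED_MW

if __name__ == "__main__":
    print(f"break-even interval: {breakeven_interval_s():.0f} s")
```

With these made-up numbers the break-even point would be 50 s: events arriving more often than that favor keeping the link, rarer events favor reconnecting. Is this the right mental model, and are realistic values for these two constants known?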
I am using classic Bluetooth, but how would BLE compare? Would it be worth porting (if that is even possible)?
Unfortunately, I don't have the equipment to measure the power consumption myself.

Also, there is no low-level hardware access for optimization, nor high-level access to the BT device (I can't turn Bluetooth on or off), so the options are:
1. connect to the device and keep the connection open, or
2. connect to the device, deliver the data, disconnect, and repeat.
The delay incurred by reconnecting is acceptable.

I already minimize the number of discoveries: on reconnect I try to connect directly to the device. What other considerations should be taken into account?
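One consideration I am already experimenting with is coalescing outgoing messages, so that option 2 pays the reconnect cost once per batch rather than once per message. A minimal sketch of the idea (the `send_fn` callback is a hypothetical stand-in for my connect/deliver/disconnect cycle, and the thresholds are arbitrary):

```python
import time
from collections import deque

class BatchedSender:
    """Queue messages and flush them in one connect/send/disconnect cycle,
    either when the batch is full or when the oldest message gets too old."""

    def __init__(self, send_fn, max_items=8, max_age_s=30.0):
        self._send_fn = send_fn      # hypothetical hook: does one full BT cycle
        self._max_items = max_items
        self._max_age_s = max_age_s
        self._queue = deque()
        self._oldest = None          # timestamp of the oldest queued message

    def submit(self, msg, now=None):
        now = time.monotonic() if now is None else now
        if not self._queue:
            self._oldest = now
        self._queue.append(msg)
        if len(self._queue) >= self._max_items or now - self._oldest >= self._max_age_s:
            self.flush()

    def flush(self):
        if self._queue:
            self._send_fn(list(self._queue))  # one reconnect cycle for the whole batch
            self._queue.clear()
            self._oldest = None
```

Does batching like this meaningfully change the trade-off, or is the per-message overhead negligible compared with the link setup itself?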