That’s an interesting comment - so I did some research …
There are three factors that affect how much flash is available for the user application: the total flash on the platform, the bootloader size, and the base firmware size.
Amount of flash on platform (from Arduino IDE boards.txt & Spark docs)
32K Arduino UNO, Pro, Mini (ATmega328)
256K Arduino Mega 2560 (ATmega2560)
32K Arduino Pro Micro (ATmega32U4)
128K Spark (STM32F103)
128K RFduino (nRF51822)
Size of the bootloader (from Arduino IDE boards.txt & Spark docs)
0 Arduino UNO
2K Arduino Pro Mini
4K Arduino Mega 2560
4K Arduino Pro Micro
Size of an empty setup() and loop() application (rough estimate of base firmware)
466 bytes Arduino UNO, Pro, Mini, Micro
466 bytes Arduino Mega 2560
73K Spark
466 bytes RFduino
Flash available for user application (available flash - bootloader - base firmware)
31.5K Arduino UNO
29.5K Arduino Pro Mini
251.5K Arduino Mega 2560
27.5K Arduino Pro Micro
~55K Spark (128K - 73K base firmware; no bootloader figure included)
~127.5K RFduino (no bootloader figure included)
So the Spark has more flash available to the user application than the smaller Arduino platforms, but significantly less than the Arduino Mega and the RFduino.
The Spark has a lot of potential (SRAM and processing speed), but the code overhead of supporting the cloud reduces the flash available for the user application, pushing it into the smaller category of platforms.
SRAM for comparison
2K Arduino UNO, Pro Mini
2.5K Arduino Pro Micro
8K Arduino Mega
20K Spark (STM32F103)
Spark takes the lead with available SRAM, which is probably why it can support a TCP/IP stack.
I should point out that, because of the base firmware on the Spark, a user application could consume less flash by leveraging library functions already included in the base firmware. In other words, the user application may take up less space in flash on the Spark than on other platforms, since library routines are already loaded in support of the Spark protocol.
I am hopeful that the hardware abstraction refactor now under way may provide an opportunity to reduce the base firmware size.