API compile and flash included files not found

Hello,
I have three files locally that I am trying to flash to the core using curl. The example.ino file and then a library trexController.h and trexController.cpp. I have them in the same directory and call the following in a linux termial.

curl -X PUT -F file=@example.ino 'https://api.particle.io/v1/devices/deviceidhere?access_token=accesstokenhere'

It connects to the Spark cloud, and I get a failed response like this:

{
  "ok": false,
  "errors": [
    {
      "ok": false,
      "output": "Compiler timed out or encountered an error",
      "stdout": "Building core-common-lib\nmake[1]: Nothing to be done for `all'.\n\nBuilding core-communication-lib\nmake[1]: Nothing to be done for `all'.\n\nBuilding user file: ../197fb341661612a1970cb6387cfaadbb0ab2360cd6764f886acc39650beb/example.cpp\narm-none-eabi-g++ -DUSE_STDPERIPH_DRIVER -DSTM32F10X_MD -DDFU_BUILD_ENABLE -DSPARK -I\"../../core-common-lib/CMSIS/Include\" -I\"../../core-common-lib/CMSIS/Device/ST/STM32F10x/Include\" -I\"../../core-common-lib/STM32F10x_StdPeriph_Driver/inc\" -I\"../../core-common-lib/STM32_USB-FS-Device_Driver/inc\" -I\"../../core-common-lib/CC3000_Host_Driver\" -I\"../../core-common-lib/SPARK_Firmware_Driver/inc\" -I\"../../core-common-lib/SPARK_Services/inc\" -I\"../libraries\" -I\"../../core-communication-lib/lib/tropicssl/include\" -I\"../../core-communication-lib/src\" -I\"../inc\" -Os -ffunction-sections -Wall -std=gnu++0x -fno-exceptions -fno-rtti -c -fmessage-length=0 -MMD -MP -MF\"../197fb341661612a1970cb6387cfaadbb0ab2360cd6764f886acc39650beb/example.d\" -MT\"../197fb341661612a1970cb6387cfaadbb0ab2360cd6764f886acc39650beb/example.d\" -mcpu=cortex-m3 -mthumb -g3 -gdwarf-2 -o \"../197fb341661612a1970cb6387cfaadbb0ab2360cd6764f886acc39650beb/example.o\" \"../197fb341661612a1970cb6387cfaadbb0ab2360cd6764f886acc39650beb/example.cpp\"\n",
      "errors": [
        "build didn't produce binary Error: Command failed: In file included from ../inc/spark_wiring.h:29:0,\n                 from ../inc/application.h:29,\n                 from ../197fb341661612a1970cb6387cfaadbb0ab2360cd6764f886acc39650beb/example.cpp:2:\n../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning \"Defaulting to Release Build\" [-Wcpp]\n #warning  \"Defaulting to Release Build\"\n  ^\n../197fb341661612a1970cb6387cfaadbb0ab2360cd6764f886acc39650beb/example.cpp:3:43: fatal error: trexController/trexController.h: No such file or directory\n void setup();\n                                           ^\ncompilation terminated.\nmake: *** [../197fb341661612a1970cb6387cfaadbb0ab2360cd6764f886acc39650beb/example.o] Error 1\n",
        "In file included from ../inc/spark_wiring.h:29:0,\n                 from ../inc/application.h:29,\n                 from example.cpp:2:\n../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning \"Defaulting to Release Build\" [-Wcpp]\n #warning  \"Defaulting to Release Build\"\n  ^\nexample.cpp:3:43: **fatal error: trexController.h: No such file or directory**\n void setup();\n                                           ^\ncompilation terminated.\nmake: *** [example.o] Error 1\n",
        {
          "killed": false,
          "code": 2,
          "signal": null
        }
      ]
    }
  ]
}

I see at the bottom it says 'fatal error: trexController.h: No such file or directory'. Does this mean it cannot find the included files? In example.ino I do include the .h file at the top. It works when example.ino is the only file, with no other libraries included. I don't understand why it can't find the files for the library.
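One detail worth double-checking, since the error output above resolves the include as trexController/trexController.h in one place and plain trexController.h in another: if the files are uploaded flat (no subfolder), the include in example.ino should not carry a folder prefix. A sketch of both forms:

```cpp
// In example.ino — assuming the library files end up alongside it on the build server:
#include "trexController.h"                   // use this when trexController.h sits in the same directory
// #include "trexController/trexController.h" // only resolves if the library is in a subfolder of that name
```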

You’ll have to upload the other included files (*.h, *.cpp, etc.) that you reference in your .ino file. I’m not sure off the top of my head what the cURL syntax is for uploading multiple files in a single PUT. Is it possible for you to use particle-cli instead of plain cURL? If not, I’ll see what I can do to hack out a solution using cURL!
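For what it’s worth, cURL’s `-F` flag can be repeated to attach several form parts to one request. A rough sketch, reusing the placeholder device ID and access token from above, and assuming the cloud compiler accepts additional parts named file1, file2, and so on:

```shell
curl -X PUT \
  -F file=@example.ino \
  -F file1=@trexController.h \
  -F file2=@trexController.cpp \
  'https://api.particle.io/v1/devices/deviceidhere?access_token=accesstokenhere'
```

Treat the extra part names as an assumption worth verifying against the cloud API docs; the multi-`-F` syntax itself is standard cURL.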


Thanks for the quick response. I have not tried doing this with particle-cli, but looking at it now, I think it will work better than cURL for my purpose. I will switch over to node.js and try it. Thanks!

The method is to cd one directory above your project folder and run particle compile dir_name.
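Concretely, with the three files from this thread (the folder name trex_project is just a placeholder), the layout and command would look something like:

```shell
# Hypothetical layout:
#   trex_project/
#   ├── example.ino
#   ├── trexController.h
#   └── trexController.cpp

cd ..                          # move to the directory one level above trex_project
particle compile trex_project  # sends all source files in the folder to the cloud compiler
```

Because the whole directory is uploaded together, the flat include `#include "trexController.h"` in example.ino resolves without any folder prefix.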