Spark Core Http Client Library

Heya @ryotsuke,

The firmware release was delayed on our end, so the current build IDE isn’t using the newer RAM-optimized version yet. It looks like this library allocates a 1K buffer and a TCPClient object, so when building on the IDE that would use about 1.5K of RAM, I think. If you made the buffer a little smaller, that would probably help a lot for the moment. We’re close to marking another stable release of the firmware (this week), so the build IDE version will be updated when that’s ready.

Thanks!
David

So free memory in the current web IDE is less than 1.5K? That’s veeeery little.
How much is available after the RAM optimizations?

I think after the first round of optimizations it’s at least 4K, and we’re going to work very hard to make sure it never goes below that again. :slight_smile:

@ryotsuke Thanks for adding support for IP addresses, that was much needed. I haven’t had time to take a close look yet, but I might prefer that we keep it as one parameter, “request.host” or something, and then automatically recognize whether it’s a URL or an IP.
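That auto-detection could be sketched roughly like this (a hypothetical helper, not part of the library — it just classifies the string, on the assumption that a value made only of digits and dots is an IPv4 address):

```cpp
#include <cctype>

// Hypothetical helper: treat the value as an IP address only if it is
// made up exclusively of digits and dots, e.g. "192.168.1.12";
// anything else is assumed to be a hostname/URL.
bool looksLikeIp(const char* host) {
    if (host == nullptr || *host == '\0') return false;
    for (const char* p = host; *p != '\0'; ++p) {
        if (!isdigit(static_cast<unsigned char>(*p)) && *p != '.') {
            return false;  // letters etc. => hostname
        }
    }
    return true;
}
```

The library could then branch internally: call `connect(IPAddress, port)` when it looks like an IP, or `connect(hostname, port)` otherwise.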

I’m travelling at the moment, but I brought my Spark so I think I should be able to take a closer look later this week.

My knowledge of C++ is too basic to do better :smile:

I’m no whiz so don’t have too high expectations, but I’ll give it a go. :smile:

1 Like

First, thank you @nmattisson for creating the library, and @ryotsuke for your work adding the IP address support and your useful posts.

I am trying to use the library in the web IDE (I’ve only been playing with the Spark for a day, so I have not had time to try the Netbeans setup). I think I am close to getting it to work, but it is not compiling.

I added the application.h, HttpClient.cpp and HttpClient.h as additional files in the web IDE.

One thing I noticed is that the http.post method seems to have the order of the request and response parameters reversed from the get method. Is this correct?

Does anyone have a working POST example?

Thanks in advance.

Here is my application code:

#include "application.h"
#include "HttpClient.h"

#include <string.h>


/**
* Declaring the variables.
*/
unsigned int nextTime = 0;    // Next time to contact the server
HttpClient http;
const char method[] = "POST";

// Headers currently need to be set at init, useful for API keys etc.
http_header_t headers[] = {
    //{ "Content-Type", "application/json" },
    //  { "Accept" , "application/json" },
    { "Accept" , "*/*"},
    { NULL, NULL } // NOTE: Always terminate headers with NULL
};

http_request_t request;
http_response_t response;

void setup() {
    Serial.begin(9600);
}

void loop() {
    if (nextTime > millis()) {
        return;
    }

    Serial.println();
    Serial.println("Application>\tStart of Loop.");
    // Request path and body can be set at runtime or at setup.
    //request.hostname = "www.timeapi.org";
    request.ip = {192,168,1,12};
    request.port = 9000;
    request.path = "/post_beacon";

    // The library also supports sending a body with your request:
    request.body = "{\"accuracy\":1.4611688320646095,\"clientId\":\"317a1f90-a4c0-11e3-88e2-9d56db0bd2c7\",\"id\":\"aa50a994-b062-4d4d-8b5c-188e61343b2e\",\"majorId\":2,\"minorId\":4,\"power\":-74,\"proximity\":2,\"proximityId\":\"2f234454-cf6d-4a0f-adf2-f4911ba9ffa6\",\"remoteTime\":1402723272866,\"rssi\":-78,\"userId\":\"24b0fc20-a3df-11e3-996e-d50f69be8660\"},{\"accuracy\":1.01076,\"clientId\":\"317a1f90-a4c0-11e3-88e2-9d56db0bd2c7\",\"id\":\"abee4413-f276-4e1b-94f6-c93ce95cd2b6\",\"majorId\":2,\"minorId\":7,\"power\":-76,\"proximity\":2,\"proximityId\":\"2f234454-cf6d-4a0f-adf2-f4911ba9ffa6\",\"remoteTime\":1402723272866,\"rssi\":-76,\"userId\":\"24b0fc20-a3df-11e3-996e-d50f69be8660\"}";

    // Get request
    http.post(request, response, headers);
    Serial.print("Application>\tResponse status: ");
    Serial.println(response.status);

    Serial.print("Application>\tHTTP Response Body: ");
    Serial.println(response.body);

    nextTime = millis() + 10000;
}

And here is the error trace when i compile:

In file included from ../inc/spark_wiring.h:30:0,
from ../inc/spark_wiring_stream.h:36,
from ../inc/spark_wiring_client.h:24,
from ../inc/spark_wiring_tcpclient.h:29,
from /HttpClient.h:5,
from /HttpClient.cpp:1:
../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning "Defaulting to Release Build" [-Wcpp]
#warning "Defaulting to Release Build"
^
In file included from ../../core-common-lib/CC3000_Host_Driver/evnt_handler.h:38:0,
from ../inc/spark_wlan.h:33,
from ../inc/main.h:38,
from ../inc/spark_utilities.h:30,
from ../inc/spark_wiring.h:34,
from ../inc/spark_wiring_stream.h:36,
from ../inc/spark_wiring_client.h:24,
from ../inc/spark_wiring_tcpclient.h:29,
from /HttpClient.h:5,
from /HttpClient.cpp:1:
../../core-common-lib/CC3000_Host_Driver/socket.h:146:0: warning: "fd_set" redefined [enabled by default]
#define fd_set _types_fd_set_cc3000
^
In file included from /opt/gcc_arm/arm-none-eabi/include/stdio.h:47:0,
from ../inc/spark_wiring_print.h:30,
from ../inc/spark_wiring_client.h:23,
from ../inc/spark_wiring_tcpclient.h:29,
from /HttpClient.h:5,
from /HttpClient.cpp:1:
/opt/gcc_arm/arm-none-eabi/include/sys/types.h:256:0: note: this is the location of the previous definition
#define fd_set _types_fd_set
^
In file included from ../../core-common-lib/CC3000_Host_Driver/evnt_handler.h:38:0,
from ../inc/spark_wlan.h:33,
from ../inc/main.h:38,
from ../inc/spark_utilities.h:30,
from ../inc/spark_wiring.h:34,
from ../inc/spark_wiring_stream.h:36,
from ../inc/spark_wiring_client.h:24,
from ../inc/spark_wiring_tcpclient.h:29,
from /HttpClient.h:5,
from /HttpClient.cpp:1:
../../core-common-lib/CC3000_Host_Driver/socket.h:162:0: warning: "FD_SET" redefined [enabled by default]
#define FD_SET(fd, fdsetp) __FD_SET (fd, fdsetp)
^
In file included from /opt/gcc_arm/arm-none-eabi/include/stdio.h:47:0,
from ../inc/spark_wiring_print.h:30,
from ../inc/spark_wiring_client.h:23,
from ../inc/spark_wiring_tcpclient.h:29,
from /HttpClient.h:5,
from /HttpClient.cpp:1:
/opt/gcc_arm/arm-none-eabi/include/sys/types.h:258:0: note: this is the location of the previous definition
# define FD_SET(n, p) ((p)->fds_bits[(n)/NFDBITS] |= (1L << ((n) % NFDBITS)))
^
In file included from ../../core-common-lib/CC3000_Host_Driver/evnt_handler.h:38:0,
from ../inc/spark_wlan.h:33,
from ../inc/main.h:38,
from ../inc/spark_utilities.h:30,
from ../inc/spark_wiring.h:34,
from ../inc/spark_wiring_stream.h:36,
from ../inc/spark_wiring_client.h:24,
from ../inc/spark_wiring_tcpclient.h:29,
from /HttpClient.h:5,
from /HttpClient.cpp:1:
../../core-common-lib/CC3000_Host_Driver/socket.h:163:0: warning: "FD_CLR" redefined [enabled by default]
#define FD_CLR(fd, fdsetp) __FD_CLR (fd, fdsetp)
^
In file included from /opt/gcc_arm/arm-none-eabi/include/stdio.h:47:0,
from ../inc/spark_wiring_print.h:30,
from ../inc/spark_wiring_client.h:23,
from ../inc/spark_wiring_tcpclient.h:29,
from /HttpClient.h:5,
from /HttpClient.cpp:1:
/opt/gcc_arm/arm-none-eabi/include/sys/types.h:259:0: note: this is the location of the previous definition
# define FD_CLR(n, p) ((p)->fds_bits[(n)/NFDBITS] &= ~(1L << ((n) % NFDBITS)))
^
In file included from ../../core-common-lib/CC3000_Host_Driver/evnt_handler.h:38:0,
from ../inc/spark_wlan.h:33,
from ../inc/main.h:38,
from ../inc/spark_utilities.h:30,
from ../inc/spark_wiring.h:34,
from ../inc/spark_wiring_stream.h:36,
from ../inc/spark_wiring_client.h:24,
from ../inc/spark_wiring_tcpclient.h:29,
from /HttpClient.h:5,
from /HttpClient.cpp:1:
../../core-common-lib/CC3000_Host_Driver/socket.h:164:0: warning: "FD_ISSET" redefined [enabled by default]
#define FD_ISSET(fd, fdsetp) __FD_ISSET (fd, fdsetp)
^
In file included from /opt/gcc_arm/arm-none-eabi/include/stdio.h:47:0,
from ../inc/spark_wiring_print.h:30,
from ../inc/spark_wiring_client.h:23,
from ../inc/spark_wiring_tcpclient.h:29,
from /HttpClient.h:5,
from /HttpClient.cpp:1:
/opt/gcc_arm/arm-none-eabi/include/sys/types.h:260:0: note: this is the location of the previous definition
# define FD_ISSET(n, p) ((p)->fds_bits[(n)/NFDBITS] & (1L << ((n) % NFDBITS)))
^
In file included from ../../core-common-lib/CC3000_Host_Driver/evnt_handler.h:38:0,
from ../inc/spark_wlan.h:33,
from ../inc/main.h:38,
from ../inc/spark_utilities.h:30,
from ../inc/spark_wiring.h:34,
from ../inc/spark_wiring_stream.h:36,
from ../inc/spark_wiring_client.h:24,
from ../inc/spark_wiring_tcpclient.h:29,
from /HttpClient.h:5,
from /HttpClient.cpp:1:
../../core-common-lib/CC3000_Host_Driver/socket.h:165:0: warning: "FD_ZERO" redefined [enabled by default]
#define FD_ZERO(fdsetp) __FD_ZERO (fdsetp)
^
In file included from /opt/gcc_arm/arm-none-eabi/include/stdio.h:47:0,
from ../inc/spark_wiring_print.h:30,
from ../inc/spark_wiring_client.h:23,
from ../inc/spark_wiring_tcpclient.h:29,
from /HttpClient.h:5,
from /HttpClient.cpp:1:
/opt/gcc_arm/arm-none-eabi/include/sys/types.h:261:0: note: this is the location of the previous definition
# define FD_ZERO(p) (__extension__ (void)({ \
^
In file included from ../inc/spark_wiring.h:37:0,
from ../inc/spark_wiring_stream.h:36,
from ../inc/spark_wiring_client.h:24,
from ../inc/spark_wiring_tcpclient.h:29,
from /HttpClient.h:5,
from /HttpClient.cpp:1:
../inc/spark_wiring_ipaddress.h: In member function 'IPAddress::operator uint32_t()':
../inc/spark_wiring_ipaddress.h:53:52: warning: dereferencing type-punned pointer will break strict-aliasing rules [-Wstrict-aliasing]
operator uint32_t() { return *((uint32_t*)_address); };
^
../inc/spark_wiring_ipaddress.h: In member function 'bool IPAddress::operator==(const IPAddress&)':
../inc/spark_wiring_ipaddress.h:54:72: warning: dereferencing type-punned pointer will break strict-aliasing rules [-Wstrict-aliasing]
bool operator==(const IPAddress& addr) { return (*((uint32_t*)_address)) == (*((uint32_t*)addr._address)); };
^
../inc/spark_wiring_ipaddress.h:54:105: warning: dereferencing type-punned pointer will break strict-aliasing rules [-Wstrict-aliasing]
bool operator==(const IPAddress& addr) { return (*((uint32_t*)_address)) == (*((uint32_t*)addr._address)); };
^
In file included from ../inc/spark_wiring.h:30:0,
from /application.h:6,
from /webservice-with-http-lib.cpp:1:
../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning "Defaulting to Release Build" [-Wcpp]
#warning "Defaulting to Release Build"
^
In file included from ../inc/spark_wiring.h:37:0,
from /application.h:6,
from /webservice-with-http-lib.cpp:1:
../inc/spark_wiring_ipaddress.h: In member function 'IPAddress::operator uint32_t()':
../inc/spark_wiring_ipaddress.h:53:52: warning: dereferencing type-punned pointer will break strict-aliasing rules [-Wstrict-aliasing]
operator uint32_t() { return *((uint32_t*)_address); };
^
../inc/spark_wiring_ipaddress.h: In member function 'bool IPAddress::operator==(const IPAddress&)':
../inc/spark_wiring_ipaddress.h:54:72: warning: dereferencing type-punned pointer will break strict-aliasing rules [-Wstrict-aliasing]
bool operator==(const IPAddress& addr) { return (*((uint32_t*)_address)) == (*((uint32_t*)addr._address)); };
^
../inc/spark_wiring_ipaddress.h:54:105: warning: dereferencing type-punned pointer will break strict-aliasing rules [-Wstrict-aliasing]
bool operator==(const IPAddress& addr) { return (*((uint32_t*)_address)) == (*((uint32_t*)addr._address)); };
^
/HttpClient.o: In function `HttpClient::request(http_request_t&, http_response_t&, http_header_t*, char const*)':
/spark/compile_server/shared/workspace/worker_2/core-firmware/build//HttpClient.cpp:270: undefined reference to `String::operator=(String&&)'
collect2: error: ld returned 1 exit status
make: *** [10483202883edd402bd84c0334b858924f3e4690e599173dcf2a05d20f49.elf] Error 1

Error: Could not compile. Please review your code.

Yes — based on feedback I changed the order in the code and then forgot to change the readme but I corrected it now. Sorry about that.

I didn’t dig too deeply into your code, but it looks to me like a linker error, so it could be that you need to add the library to your build file.

1 Like

By the way, I think you should keep the IP as a separate parameter and actually allow providing BOTH a hostname and an IP. Providing both would mean connecting to the IP with the Host header set to the hostname. This would be analogous to a “hosts” file entry for that IP, useful for local servers.

1 Like
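A minimal sketch of that idea (the function name is hypothetical, not the library’s API): connect to the IP, but prefer the hostname when building the Host header, falling back to the raw IP text when no hostname was given.

```cpp
#include <string>

// Sketch: when both a hostname and an IP are provided, the socket
// connects to the IP, but the Host header carries the hostname --
// like a local "hosts" file entry for that IP.
std::string buildHostHeader(const char* hostname, const char* ipText) {
    // Prefer the hostname if it was provided; fall back to the IP text.
    const char* value = (hostname != nullptr && *hostname != '\0')
                            ? hostname
                            : ipText;
    return std::string("Host: ") + value + "\r\n";
}
```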

Hi

Some good work here.

Having used the Electric Imp (which has great server-side support), I am looking at an open source alternative, and the Spark could be the ideal option. Is anyone from the Spark team available to comment on timelines for a production-ready HTTP library?

Thank you
Kevin

Hi Kevin,

I’m not sure Spark is making an official http library, but it’s not really that difficult so don’t let it be holding you back!
As you can perhaps tell from the lack of updates, my focus has shifted a bit from developing this library, but thanks to people who have tested it and helped fixing bugs it’s actually reasonably stable. It’s not well-tested enough for production and could surely be improved but try it out in development and see how far it will take you.

Cheers,
Nils

1 Like

Hi Nils

Thanks for your reply. I think you are correct, I should give it a go!
I will be back for advice if I hit any issues.

Cheers

Kevin

Hey All!

We’re not working on an officially supported HTTP library just yet, but we love that awesome community members like @nmattisson jumped in and created one to share. :slight_smile:

HTTP requests are a really common need, so we want to make sure it’s as easy as possible, and we might be able to put some resources towards it after our round of hiring, but nothing official yet.

Thanks!
David

1 Like

I found that the core has a problem dealing with Strings: https://github.com/spark/core-firmware/issues/146
I had to comment out these 2 lines to make it work:
//String raw_response(buffer);
//String statusCode = raw_response.substring(9,12);
One option would be to parse the status without using Strings.

1 Like
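Parsing the status code straight out of the raw response buffer would avoid the String class entirely. A rough sketch (a standalone helper, not taken from the library): the status line looks like "HTTP/1.1 200 OK", so the code is the token after the first space.

```cpp
#include <cstdlib>
#include <cstring>

// Parse the HTTP status code directly from the raw response buffer,
// with no String objects. Returns -1 if the status line is malformed.
int parseStatusCode(const char* buffer) {
    const char* space = strchr(buffer, ' ');  // find " 200 OK..."
    if (space == nullptr) return -1;          // no space: malformed
    return atoi(space + 1);                   // atoi stops at "OK"
}
```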

That library was created for the Spark core and has worked for me in the past.

Can you say what error you are getting?

1 Like

Hi @bko,
The error I’m getting is this error: undefined reference to `String::operator=(String&&)’ and only when compiling from the IDE. I found someone else having a similar issue http://community.spark.io/t/having-trouble-compiling-code-on-spark-that-works-on-arduino/3328/6 and I decide to comment the Strings and the problem was gone… now I’m trying to work around it.
Thanks a lot for your help!

1 Like

OK, I played with this for a few minutes and I think I have seen this before. The struct with the String inside, being assigned on line 261, is the real problem. Try changing that section like this: make the string empty and then append to it, so we don’t change the pointer in the struct.

    client.stop();

    String raw_response(buffer);

    // Not super elegant way of finding the status code, but it works.
    String statusCode = raw_response.substring(9,12);

    #ifdef LOGGING
    Serial.print("HttpClient>\tStatus Code: ");
    Serial.println(statusCode);
    #endif

    int bodyPos = raw_response.indexOf("\r\n\r\n");
    if (bodyPos == -1) {
        #ifdef LOGGING
        Serial.println("HttpClient>\tError: Can't find HTTP response body.");
        #endif

        return;
    }
    // Return the entire message body from bodyPos+4 till end.
    aResponse.body = "";
    aResponse.body += raw_response.substring(bodyPos+4);
    aResponse.status = atoi(statusCode.c_str());
}
2 Likes

Thanks! That worked. It would be great to add it to the GitHub repository.

I want to thank Brian (@bko) again, and I have another, hopefully easy, question.

I have been trying to connect to an HTTPS site; I should use port 443, right? It seems like the request is sent, but I get no response, though I do if I use curl from the terminal.

Finally, I noticed that the IDE supports libraries.
(Importing a library to the IDE begins by creating an open source GitHub repository with a spark.json file.)

Hi @juano2310

Yes, 443 is the default TLS/SSL port for HTTPS. But you know the core does not do HTTPS at this time, right? Are you writing your own?

Yes, IDE libraries are a major win. We should convince @nmattisson to add the spark.json to his library so it can be used directly in the web IDE!

1 Like