Configuring Spark Core on a closed Wi-Fi network

Hello,

I am a student studying Electrical Engineering and I am planning on using the Spark Core in my senior capstone project. Our team is developing a prototype of a magnetic field monitoring system for a professor at our school. We were given a specific sensor to use and are planning on using the Spark Core to forward the sensor data to a server to be processed and displayed. The network admins at our school have agreed to help us set up a closed Wi-Fi network independent of the school's network. This means we do not have access to the internet and will not be able to utilize the Spark Cloud feature.

This brings me to my question: is there any documentation on connecting directly to the Spark Core via TCP programming without using the Spark Cloud?

Thanks for the help,

Tim


Hey Tim,

The best way to do this would be to run the local version of the Spark Cloud on your network. We don’t have documentation for this yet, but we will in the near future.

Z

Hey Zach,

Thanks for the quick response. This sounds like a good solution. Do you have any estimation of when you will be publishing this information?

Tim

We’re publishing everything at delivery in early November. So coming up very soon!

Z

Awesome, thanks! Looking forward to using the Spark Core!

Hey Zach,

I received my Cores today. They look great. What is the current state of the local Spark Cloud version? Is it available?

Best regards
Dominik

Hi @wittmer,

We’re still working on the Spark local cloud. I’ve been focusing on the command-line tool first because switching servers on the Core requires juggling some keys and firmware, i.e. it would be a bit frustrating to use the local server right now without a tool to make it easier. If it wasn’t as secure it’d be easier, but where is the fun in that? :smile:

I am hoping this can be ready soon (in the next few weeks), but I don’t have a hard date yet.

Thanks,
David

Hi David

Thanks for your answer. Since I will not be able to access the cloud, I will use the TCP functionality until a local cloud is available.

Dominik
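For anyone following along: the cloud-independent TCP path Dominik mentions can be sketched with the firmware’s built-in `TCPClient` class. This is an illustrative sketch, not Dominik’s actual code; the server IP, port, analog pin, and payload format are all placeholders:

```cpp
// Illustrative sketch: push a sensor reading to a local server over raw TCP,
// with no Spark Cloud involvement. IP, port, and pin are hypothetical.
TCPClient client;
byte serverIP[] = { 192, 168, 1, 100 };  // placeholder local server address
const int serverPort = 8080;             // placeholder port

void setup() {
    pinMode(A0, INPUT);                  // stand-in for the magnetometer input
}

void loop() {
    int reading = analogRead(A0);
    if (client.connect(serverIP, serverPort)) {
        client.print("reading=");        // simple line-based payload
        client.println(reading);
        client.flush();
        client.stop();                   // close the connection each sample
    }
    delay(1000);                         // roughly one sample per second
}
```

Opening and closing the socket on every sample keeps the sketch simple; a real deployment would likely hold one connection open and handle reconnects.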


Hello @Dave and @zach

I recently received my first Spark Core and have been messing around with it for the past couple weeks. It is amazing! I’m looking forward to using additional cores to implement my sensor network.

Do you plan on delivering instructions on how to host the local Spark Cloud when it is ready? I have spent a good amount of time googling how to host a node.js app and it seems relatively complex (at least for an EE rather than a web dev/programmer, haha).

Also, what is the release date for the local Cloud looking like?

Thanks for all the help!

Tim


No official release date; it’s towards the top of our priority list but there are a few important bug fixes and features that have taken priority. We just wrapped up a sprint on Friday and will be starting another on Monday, so it’s possible it might make its way into this sprint (and if so, be completed by Feb 7). Otherwise I would expect that it will be in the following sprint, and be completed by Feb 21.

Hey @zach

Any updates regarding the local Spark Cloud?

Thanks for the update!

We had a great discussion recently here :smile:

https://community.particle.io/t/where-is-the-source-code-for-the-cloud/1381/last?redirected=true

@kennethlimcp Thanks for the link!

That type of information is exactly what I’m looking for. Do you happen to know if TCP socket communication on a local Wi-Fi network is even possible without the Spark Core first authenticating with the Spark Cloud?

I’m hoping my whole project isn’t 100% dependent on the release of the local cloud but it’s starting to look more and more like that’s the case haha :smile:

Thanks again for the info!

It’s possible if you are compiling the code locally.

This is going to change real soon when the new code is being pushed to the Web IDE! :smiley:

Just have a little more patience hahahahaha

The new changes being pushed to the Web IDE are definitely going to be a game changer, since they allow you to connect to and disconnect from the Spark Cloud and do your own stuff.

Till then!

You can do this now by including “spark_disable_wlan.h” and “spark_disable_cloud.h” in your code, and then use Spark.connect() / etc to manage your connection – some more info here: http://docs.spark.io/#/firmware/cloud-functions-connection-management

Thanks,
David
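To make the mechanism Dave describes concrete, a minimal sketch might look like the following. This is an assumption-laden illustration based on the includes and the linked connection-management docs, not official example code; `needCloud` is a hypothetical flag:

```cpp
// Sketch of cloud-optional firmware: the include below stops the Core from
// handshaking with the Spark Cloud at boot, while Wi-Fi still comes up, and
// Spark.connect()/Spark.disconnect() manage the cloud link manually.
#include "spark_disable_cloud.h"

bool needCloud = false;  // hypothetical condition for opting into the cloud

void setup() {
    // Normal setup; the Core joins Wi-Fi but does not contact the Spark Cloud.
}

void loop() {
    // ... local-network TCP work goes here ...

    if (needCloud) {
        Spark.connect();     // opt in to the cloud handshake on demand
        // ... cloud work ...
        Spark.disconnect();  // drop the cloud link, keep Wi-Fi up
    }
}
```

There is also a `spark_disable_wlan.h` header mentioned above for keeping Wi-Fi itself off until you ask for it; the sketch here only skips the cloud handshake.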

Thanks @Dave!

Does this require the Spark Core to authenticate with the Spark Cloud every time it is powered on before you can use the Spark.disconnect() function?

Hi @tsteltzer,

I believe including those header files tells the Core not to handshake with the cloud until you call ‘connect’. :smile:

Thanks!
David

@Dave

Thanks again for the quick response. This sounds incredible! I’m really hoping this will work.

I tried to include the header files in my Spark firmware and unfortunately my code wouldn’t compile…

Include statements:

IDE Compile error:

Do you have any idea what I might be doing wrong? I found both of the header files you were referencing at:
https://github.com/spark/core-firmware/tree/master/inc

Do I have to manually include these somehow or what do I need to do?

Thanks for all the help!

Hi @tsteltzer,

Good question! Sorry about the confusion, my bad! Those changes are present in the master branch of the core-firmware, but that branch is different from what the build IDE uses. We want to make sure that whatever is built on the IDE is stable and built against well-tested firmware; because the master branch changes frequently, we lock the build IDE to the compile-server2 branch. About once a sprint (every 2 weeks or so), we roll out the recent improvements to that branch, and any new firmware built against it will automatically include those changes.

We’re in the process of testing the next firmware release, which I’m hoping will happen sometime next week. You can get these changes now if you’re compiling and developing locally.

Thanks!
David

@Dave, thanks again for the response!

Cool, that makes a lot more sense now. I will look into loading firmware via USB in the meantime.

How can I find out when the new firmware is pushed to the Web IDE?