Tutorial: Local Cloud on Windows [25 July 2014]

@kennethlimcp, I only ended up getting it working on Win 7 x32. I never got VC++ 2010 to compile the ursa code without errors. However, once I had the 32-bit compiled files on the Win7 x32 system, I copied them over to the Win8.1 x64 system and was able to get the local server working there.
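
In case it helps anyone repeating this, here is a minimal sketch of the copy step, run in PowerShell on the Win8.1 x64 machine. The share name and folder locations are only examples; point the paths at wherever spark-server lives on each machine.

    # Copy the ursa module that was compiled on the Win7 x32 box into the
    # node_modules folder of the x64 install (paths are assumptions; adjust them)
    Copy-Item -Recurse '\\WIN7-PC\share\spark-server\node_modules\ursa' 'C:\spark-server\node_modules\ursa' -Force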

2 Likes

OK, I decided to go for Visual Studio 2013 instead.

Turns out the problem is related to npm modules that have node-gyp as a dependency (ursa in this case). I tried a lot of things, so I don't know exactly which one worked for me:

  • I reinstalled Python and added the Python environment variables manually
  • Started the Visual Studio command prompt and ran node-gyp configure from there
  • Tried npm install -g node-gyp --msvs_version=2013
  • Ran npm install -g restify

And after losing my mind for some time, I tried npm install again and it worked.

Sorry for not giving a specific solution for anyone with the same problem; I will try to install the local cloud server on a fresh Windows 8.1 install again.
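
For anyone hitting the same node-gyp/ursa build errors, here is a rough, unverified sketch of commands roughly equivalent to the steps above, run from the Visual Studio 2013 command prompt or PowerShell. The Python path and the msvs_version value are assumptions based on my setup (Python 2.7 and VS 2013); adjust both for yours.

    # Point npm/node-gyp at Python 2.7 and the VS 2013 build tools
    # (the Python path is an assumption; adjust it to your install)
    npm config set python "C:\Python27\python.exe"
    npm config set msvs_version 2013

    # Reinstall node-gyp globally
    npm install -g node-gyp

    # Then, from the spark-server directory, retry the install that builds ursa
    npm install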

Nice work @axel. I think my brain just exploded on that one :stuck_out_tongue:

1 Like

Haha, thanks. It bugs me that I don't know what the solution was, but I'll find out.

1 Like

Hi @kennethlimcp,

I was wondering: what is the total number of Cores/Photons that can be served from a single machine running the local cloud?

Please let me know about the handling capacity of the local cloud.

thanks,
@Akash

@Akash,

I don't think there's a limit to the number of Cores that can be served, but the key word here is single machine.

It does depend on what specifications you are referring to.

e.g., AWS micro, small, large instances, etc.

You will definitely need things like multi-threading and load balancing if you are looking at something at a scale similar to the Spark :cloud:

May I know what your use case is about? :slight_smile:

@kennethlimcp

I was going over spark.io and I came across the following:

and we even provide an open source version of our cloud services for small deployments (up to thousands of devices)

Can you suggest what minimum configuration is required to get the above-mentioned quantity of Cores working at a time?

And I certainly don't own an AWS instance yet :stuck_out_tongue_closed_eyes:; I might get one soon.

But I do look forward to scaling; however, everything is at a nascent stage and I am exploring my options.

Also, just out of curiosity, what is the maximum number of Cores that can be served from my Windows PC (nominal configuration for personal use) running the Spark local cloud? (No load balancing, just a single PC.)

I understand my questions are very basic; still, I don't want to keep guessing about these things.

:blush:

thanks,
@Akash

It's a tough question and the load varies across machines, but a safe number like 50-100 devices sounds OK.

Also, you will probably run into IP address issues on your local network router, but that's solvable. I don't think anyone has had more than 10 Cores on a local :cloud: so far :wink:

Hi @Akash

A Windows PC has a maximum number of TCP connections, and a smaller maximum number of connections per process. I believe that is around 16,000 connections, so that would form a hard limit.

I don't think anybody has ever tested the local cloud at this scale.
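
If you want to see where that ceiling sits on a given machine, one related and easy-to-inspect number is the dynamic (ephemeral) TCP port range, which defaults to 16,384 ports on Vista and later; that default may be where the ~16,000 figure comes from, although it mainly constrains outbound connections. A minimal sketch, assuming an elevated PowerShell or cmd prompt:

    # Show the current dynamic TCP port range (default: start 49152, 16384 ports)
    netsh int ipv4 show dynamicport tcp

    # Widen it if needed (administrator rights required; the values are just an example)
    netsh int ipv4 set dynamicport tcp start=10000 num=55000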

2 Likes

'git' is not recognized as an internal or external command,
operable program or batch file.

I installed the stuff, but it's not working. Any help?

Set-Location : Cannot find path 'C:\Users\Student\Documents\GitHub\spark-server\js' because it does not exist.
At line:1 char:3
+ cd <<<< spark-server/js
    + CategoryInfo          : ObjectNotFound: (C:\Users\Studen...spark-server\js:String) [Set-Location], ItemNotFoundException
    + FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.SetLocationCommand

This is what I got.

@kennethlimcp I got this problem. What should I do sir?

By the way, let me tell you what I've already done. I have already installed OpenSSL and DFU and added both to the PATH. And I ran "git clone https://github.com/spark/spark-server.git" in the Git shell, and ran node.js as administrator.

P.S. Thank you for your tutorials, like the ones on installing DFU and node.js. They help me a lot.

Did you git clone in the C drive? Otherwise you need to cd into the directory containing the spark-protocol folder first.
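
In case it helps, a minimal sketch of cloning and changing into the right folder in PowerShell; the GitHub folder path here is taken from your error message, so substitute wherever you actually want the clone to live:

    # Clone spark-server and move into it
    cd C:\Users\Student\Documents\GitHub
    git clone https://github.com/spark/spark-server.git
    cd spark-server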

Do you mean this one? So, does that mean I have installed it in C:/Users/Gookgik?

Or do you mean this?

And sir, do I have to create the js folder inside spark-server?

So it seems like this tutorial needs to be updated. There's no longer a js folder.

All you need to do is cd into the spark-server directory and run npm install, followed by node main.js.
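
Put together, a minimal sketch of that sequence in PowerShell, assuming node and npm are already installed and spark-server has already been cloned:

    # From the folder containing the clone
    cd spark-server
    npm install
    node main.js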

Please help me, sir, it's full of errors.

Do I have to type npm install in a normal cmd, not as administrator? And do I have to connect the Electron while doing this?

The Electron does not work with the local :cloud:, so you might not want to waste time on it.

Does it mean that the Electron can only use the Particle SIM and the Particle cloud? It won't work even with Google Cloud, will it? And how about ThingSpeak?

It depends on what you are trying to achieve. Webhooks are available for the Electron to publish data and send it on to Google Cloud. As for ThingSpeak, there should be a library that allows you to do that.
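
As a sketch of the webhook route (the event name and target URL here are made up, and the exact syntax can vary between Particle CLI versions):

    # Forward every "temperature" event published by your devices to an external endpoint
    particle webhook create temperature https://example.com/collect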

1 Like