@kennethlimcp, I only ended up getting it working on Win 7 x32. I could not get VC++ 2010 to compile the URSA code without errors. However, once I had the 32-bit compiled files on the Win 7 x32 system, I copied them over to the Win 8.1 x64 system and was able to get the local server working there.
OK, I decided to go for Visual Studio 2013 instead.
Turns out the problem is related to npm modules that have node-gyp as a dependency (ursa in this case). I tried a lot of things, so I don't know exactly which one worked for me.
- I reinstalled Python and added the Python ENV variable manually
- Started the Visual Studio command prompt and ran node-gyp configure from there
- Tried npm install -g node-gyp --msvs_version=2013
- npm install -g restify
And after losing my mind for some time, I tried npm install again and it worked.
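For anyone hitting the same node-gyp build failure, here is a sketch of the commonly suggested fix, assuming Python and Visual Studio 2013 are already installed (the msvs_version setting tells node-gyp which Visual Studio toolchain to use; adjust the value to match your install):

```shell
# Tell node-gyp which Visual Studio version to build with:
npm config set msvs_version 2013 --global

# Then retry the install from the project directory:
npm install
```

Running the install from the Visual Studio command prompt (so the compiler is on PATH) is another variable worth trying, as described above.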
Sorry for not giving a specific solution for anyone with the same problem; I will try to install the local cloud server on a fresh Windows 8.1 install again.
Haha thanks, it bugs me that I don't know what the solution was, but I'll find out.
Hi @kennethlimcp,
I was wondering what is the total number of cores/photons that can be served from a single machine running the local cloud?
Please let me know about the handling capacity of the local cloud.
thanks,
@Akash
I don't think there's a limit to the number of Cores that can be served, but the key word here is single machine.
It does depend on what specifications you are referring to.
e.g. AWS micro, small, large instances, etc.
You will definitely need things like multi-threading and load balancing if you are looking at something at a scale similar to the Spark cloud.
May I know what your use case is?
I was going over spark.io and I came across the following:
and we even provide an open source version of our cloud services for small deployments (up to thousands of devices)
Can you suggest what minimum configuration is required to get the above-mentioned quantity of cores working at a time?
And I certainly don't own any AWS instances yet, but I might get one soon.
But I do look forward to scale, however, everything is in a nascent stage and I am looking for my options.
Also, just out of curiosity, what is the maximum number of cores that can be served from my Windows PC (nominal configuration for personal use) running the Spark local cloud? (No load balancing, just a single PC.)
I understand my questions are very basic, but I don't want to keep myself guessing about these things.
thanks,
@Akash
It's a tough question, and the load varies across machines, but a safe number like 50-100 devices sounds OK.
Also, you will probably run into IP address issues on your local network router, but that's solvable. I don't think anyone has had more than 10 cores on a local cloud so far.
Hi @Akash
A Windows PC has a maximum number of TCP connections, and a smaller maximum number of connections per process. I believe that is around 16,000 connections, so that would form a hard limit.
I don't think anybody has ever tested the local cloud at this scale.
'git' is not recognized as an internal or external command,
operable program or batch file.
I installed the stuff, but it's not working. Any help?
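That "'git' is not recognized" error means git.exe is not on the PATH. A quick way to check from a Windows command prompt (the install path below is only an example; yours may differ):

```shell
where git
# If nothing is printed, add Git's bin directory to PATH for this session, e.g.:
set PATH=%PATH%;C:\Program Files\Git\bin
```

Alternatively, run the clone from the Git Shell that ships with GitHub for Windows, which already has git on its PATH.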
Set-Location : Cannot find path 'C:\Users\Student\Documents\GitHub\spark-server\js' because it does not exist.
At line:1 char:3
+ cd <<<< spark-server/js
    + CategoryInfo : ObjectNotFound: (C:\Users\Studen…spark-server\js:String) [Set-Location], ItemNotFoundException
    + FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.SetLocationCommand
got this
@kennethlimcp I got this problem. What should I do sir?
By the way, let me tell you what I've already done. I have already installed and added the paths of both OpenSSL and DFU. And I copied 'git clone https://github.com/spark/spark-server.git' into the Git shell, and ran Node.js as administrator.
P.S. Thank you for your tutorials, like installing DFU and installing Node.js. They helped me a lot.
Did you git clone in the C drive? Otherwise you need to cd into the directory containing the spark-protocol folder first.
Or do you mean this?
And sir, do I have to create the js folder inside spark-server?
So it seems like this tutorial needs to be updated. There's no longer a js folder.
All you need to do is cd into the spark-server directory and run npm install, followed by node main.js.
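Putting those steps together, assuming the repository has already been cloned, the sequence looks like this:

```shell
# Run from the folder that contains the cloned repository:
cd spark-server   # repository root; there is no js/ subfolder anymore
npm install       # installs dependencies from package.json
node main.js      # starts the local cloud server
```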
Do I have to type npm install in a normal cmd, not as administrator? And do I have to connect the Electron while doing this?
The Electron does not work with the local cloud, so you might not want to waste time on it.
Does that mean the Electron can only use the Particle SIM and the Particle cloud? It won't work even with Google Cloud, will it? And how about ThingSpeak?
It depends on what you are trying to achieve. Webhooks are available for the Electron to publish data and send it to Google Cloud. As for ThingSpeak, there should be a library that allows you to do that.