Documentation Needs

There has been much conversation about updating the documentation. This thread will act as an ongoing list of things that need to be documented or need their documentation improved. I invite any Spark Elite or Spark employees to edit this post as new documentation needs arise or are completed. All users are invited to contribute pull requests to the official documentation repository hosted on GitHub. Please keep replies to this topic as concise as possible. If debate is needed, please start a new thread (a la “Documentation - UDP Issues”). I think prepending new topic subjects with "Documentation - " will help everyone visually filter the list of topics. I will propose an official Documentation category to the Spark Team, which would eliminate the need to add "Documentation - " to the topic subject.

To-do

Completed

OK, and should anyone want to add to the list but lack permission to edit the main, initial post of this thread, would it be OK for them to comment as I do here, and have someone with the necessary permissions fold the information into the first post and then delete the comment (or mark it as “included”)? [This thread should be a wiki :-)]

There is a neat trick buried in this thread explaining how to turn off the status LED except when errors occur: Disabling Status LED - General - Particle
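
For reference, here is a minimal sketch of that trick, assuming the RGB.control() and RGB.color() calls behave as described in the linked thread:

void setup() {
    RGB.control(true);   // take manual control of the RGB status LED
    RGB.color(0, 0, 0);  // turn it off
}

void loop() {
    // Call RGB.control(false) to hand control back to the firmware,
    // e.g. so it can still signal errors.
}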

The effect of the ttl parameter in Spark.publish() remains undocumented (and perhaps unimplemented):
Meaning of ttl parameter in Spark.publish() - Firmware - Particle - There are other questions too: Publish & subscribe semantics & documentation - Firmware - Particle
It has recently become clear that ttl is unimplemented.
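
For context, the four-argument form under discussion looks like this (a hedged example; per the posts above, the ttl value in seconds is accepted by the API but apparently ignored):

// Publish a private event with a nominal 60-second ttl.
Spark.publish("temperature", "22.5", 60, PRIVATE);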

There is uncertainty as to how much of the gcc build environment one can rely on when compiling in the cloud or even locally. E.g., is UINT_MAX reliably the maximum unsigned int on the Spark Core? Which of the POSIX/ANSI/Standard C functions can be used (Unix manual ch 2)? Which libc functions (Unix manual ch 3)? Which #defines can be relied on? Which #defines are additionally defined?
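
One way to settle the limits question empirically is a throwaway probe sketch like the one below (untested, and it assumes limits.h and snprintf() are available in the cloud build):

#include <limits.h>

char buf[64];

void setup() {
    // Report what the toolchain actually defines for this target.
    snprintf(buf, sizeof(buf), "UINT_MAX=%u INT_MAX=%d", UINT_MAX, INT_MAX);
    Spark.publish("limits", buf);
}

void loop() {
}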

Related are the definition of the language itself, portability issues, and the fact that not even all the available functions are listed in the Spark docs:

NICE!!!

Some of that stuff is way over my head. I’ll have to leave it to the smarter guys to update the to-do list with a summary that makes sense to you smart folks.

Here are some useful installation tips for Ubuntu 14.04. They could be split up a bit for those who don’t want the entire toolchain, just spark-cli:

Here are spark-cli install tips for Ubuntu 12.04

Documentation bug:

Can this topic be pinned to stay at the top? Why not?

It is unclear in the docs whether Time.zone() is reset by Spark.syncTime().

Reading the source on GitHub, Time.zone(x) is a setter; it does not return any value. It lets the programmer set the current time zone. The value is simply multiplied by 3600 and stored as an integer number of seconds.

Spark.syncTime() does not know about or change the time zone.
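
To illustrate the behavior described above (a sketch based on that reading of the source, not on the docs):

void setup() {
    Time.zone(-5);      // stores -5 * 3600 = -18000 seconds; returns nothing
    Spark.syncTime();   // re-syncs UTC from the cloud; the zone offset is untouched
}

void loop() {
    // Time.hour(), Time.timeStr(), etc. continue to apply the -5 hour
    // offset after the sync completes.
}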

Thank you, but this isn’t clear in the docs, and needs to be made clear. There.

The doc is actually completely correct. It is the example that is wrong.

The examples are a crucial part of the docs. Much of the docs is, or would be, completely mysterious without the examples. The particular example is wrong; that was previously identified by someone else, but the comment is misleading about DST and creates an incorrect impression. And, furthermore, my question about the possible interplay between syncTime and zone is a legitimate one, one that any programmer worth his salt would check the docs for, and one not settled by the docs, with or without the examples.

It all needs fixing, do you say different?

I take full responsibility for messing up the time docs. I’m definitely going to double-check all of the returns more carefully!

I have a sick kiddo at home, so my free time has been spent cleaning up “yuck” and waiting for the next bout. Hopefully she’ll recover, and it’ll stay isolated to just one! Otherwise, my documentation weekend is shot. :frowning:

I hope all goes well, but please don’t worry about the issue of blame. Only those who do nothing don’t make mistakes. Rarely is anything correct the first time.

I knew you were a sweetheart underneath it all! But, seriously, thank you for the kind words.

I'm going to update the to-do list, but I don't think I have the energy left to write docs tonight. You see how the last docs I wrote late at night went!

Looks like @zach pushed the new http://docs.spark.io/ changes live!

YMMV, as DNS is still propagating. But it should be live for everyone by morning!

Looking good in Singapore :smiley:

@zach, can you do the honors of announcing the new features in the updated docs?

After you get some sleep, of course!

@zach the new documentation setup is amazing! You whipped this up this weekend? Mad skillz!

BTW, I followed the setup guide, and after the npm install step it went ahead and downloaded what seemed like every package in existence. The package.json doesn’t seem to call for all of that, so maybe grunt and assemble just need a ton of supporting packages?

Got this running locally in a snap. Then I updated firmware.md to test the live update feature, and got this error which I’m unsure about:


// Here you can see when the grunt server command started the server
Running "connect:livereload" (connect) task
Started connect web server on http://localhost:9000

Running "watch" task
Waiting...OK
>> File "src\content\firmware.md" changed.

Running "coffeelint:grunt" (coffeelint) task
>> 1 file lint free.

Running "clean:dest" (clean) task
Cleaning build...Fatal error: watch EPERM
Fatal error: watch EPERM

After I forced the process to end and re-ran grunt build and grunt server, it all came up just fine. Actually, just restarting the server seems to pick up the changes as well, since they are static resources.