Error: no matching function for call to 'CloudClass::variable

I was working on a piece of code that declares a few cloud variables, e.g.:

Particle.function("SetMode", SetMode);
Particle.variable("wifi",          &wifi,                  INT);
Particle.variable("tHour",         &tHour,                 INT);
Particle.variable("speed",         &speedIndex,            INT);
Particle.variable("brightness",    &brightness,            INT);
Particle.variable("modeList",      &modeList,              STRING);
Particle.variable("mode",          &currentModeName,       STRING);

Had this working for weeks, and now (literally, just now), without me having even touched any of that code, the cloud IDE started spitting out the following error messages:

error: no matching function for call to 'CloudClass::variable(const char [9], char (*)[622], const CloudVariableTypeString&)'
void colorChaser(uint32_t c);
note: candidates are:
In file included from ../wiring/inc/spark_wiring.h:46:0,
                 from ./inc/application.h:36,
                 from sparkpixels.cpp:36:
../wiring/inc/spark_wiring_cloud.h:48:24: note: static bool CloudClass::variable(const char*, const uint8_t*, const CloudVariableTypeString&)
     static inline bool variable(const char *varKey, const uint8_t *userVar, const CloudVariableTypeString& userVarType)
                        ^
../wiring/inc/spark_wiring_cloud.h:48:24: note: no known conversion for argument 2 from 'char (*)[622]' to 'const uint8_t* {aka const unsigned char*}'
../wiring/inc/spark_wiring_cloud.h:53:45: note: static bool CloudClass::variable(const char*, typename T::varref, const T&) [with T = CloudVariableTypeString; typename T::varref = const char*]
     template<typename T> static inline bool variable(const char *varKey, const typename T::varref userVar, const T& userVarType)
                                             ^
../wiring/inc/spark_wiring_cloud.h:53:45: note: no known conversion for argument 2 from 'char (*)[622]' to 'CloudVariableTypeString::varref {aka const char*}'
In file included from ../wiring/inc/spark_wiring.h:46:0,
                 from ./inc/application.h:36,
                 from sparkpixels.cpp:36:
../wiring/inc/spark_wiring_cloud.h:58:24: note: static bool CloudClass::variable(const char*, const uint32_t*, const CloudVariableTypeInt&)
     static inline bool variable(const char *varKey, const uint32_t *userVar, const CloudVariableTypeInt& userVarType)
                        ^
../wiring/inc/spark_wiring_cloud.h:58:24: note: no known conversion for argument 2 from 'char (*)[622]' to 'const uint32_t* {aka const long unsigned int*}'

error: no matching function for call to 'CloudClass::variable(const char [5], char (*)[64], const CloudVariableTypeString&)'
void pulse_oneColorAll(void);
note: candidates are:
In file included from ../wiring/inc/spark_wiring.h:46:0,
                 from ./inc/application.h:36,
                 from sparkpixels.cpp:36:
../wiring/inc/spark_wiring_cloud.h:48:24: note: static bool CloudClass::variable(const char*, const uint8_t*, const CloudVariableTypeString&)
     static inline bool variable(const char *varKey, const uint8_t *userVar, const CloudVariableTypeString& userVarType)
                        ^
../wiring/inc/spark_wiring_cloud.h:48:24: note: no known conversion for argument 2 from 'char (*)[64]' to 'const uint8_t* {aka const unsigned char*}'
../wiring/inc/spark_wiring_cloud.h:53:45: note: static bool CloudClass::variable(const char*, typename T::varref, const T&) [with T = CloudVariableTypeString; typename T::varref = const char*]
     template<typename T> static inline bool variable(const char *varKey, const typename T::varref userVar, const T& userVarType)
                                             ^
../wiring/inc/spark_wiring_cloud.h:53:45: note: no known conversion for argument 2 from 'char (*)[64]' to 'CloudVariableTypeString::varref {aka const char*}'
In file included from ../wiring/inc/spark_wiring.h:46:0,
                 from ./inc/application.h:36,
                 from sparkpixels.cpp:36:
../wiring/inc/spark_wiring_cloud.h:58:24: note: static bool CloudClass::variable(const char*, const uint32_t*, const CloudVariableTypeInt&)
     static inline bool variable(const char *varKey, const uint32_t *userVar, const CloudVariableTypeInt& userVarType)
                        ^
../wiring/inc/spark_wiring_cloud.h:58:24: note: no known conversion for argument 2 from 'char (*)[64]' to 'const uint32_t* {aka const long unsigned int*}'

error: no matching function for call to 'CloudClass::variable(const char [6], char (*)[200], const CloudVariableTypeString&)'
void police_light_strobo(void);
note: candidates are:
In file included from ../wiring/inc/spark_wiring.h:46:0,
                 from ./inc/application.h:36,
                 from sparkpixels.cpp:36:
../wiring/inc/spark_wiring_cloud.h:48:24: note: static bool CloudClass::variable(const char*, const uint8_t*, const CloudVariableTypeString&)
     static inline bool variable(const char *varKey, const uint8_t *userVar, const CloudVariableTypeString& userVarType)
                        ^
../wiring/inc/spark_wiring_cloud.h:48:24: note: no known conversion for argument 2 from 'char (*)[200]' to 'const uint8_t* {aka const unsigned char*}'
../wiring/inc/spark_wiring_cloud.h:53:45: note: static bool CloudClass::variable(const char*, typename T::varref, const T&) [with T = CloudVariableTypeString; typename T::varref = const char*]
     template<typename T> static inline bool variable(const char *varKey, const typename T::varref userVar, const T& userVarType)
                                             ^
../wiring/inc/spark_wiring_cloud.h:53:45: note: no known conversion for argument 2 from 'char (*)[200]' to 'CloudVariableTypeString::varref {aka const char*}'
In file included from ../wiring/inc/spark_wiring.h:46:0,
                 from ./inc/application.h:36,
                 from sparkpixels.cpp:36:
../wiring/inc/spark_wiring_cloud.h:58:24: note: static bool CloudClass::variable(const char*, const uint32_t*, const CloudVariableTypeInt&)
     static inline bool variable(const char *varKey, const uint32_t *userVar, const CloudVariableTypeInt& userVarType)
                        ^
../wiring/inc/spark_wiring_cloud.h:58:24: note: no known conversion for argument 2 from 'char (*)[200]' to 'const uint32_t* {aka const long unsigned int*}'
sparkpixels.cpp:272:21: warning: zero-length gnu_printf format string [-Wformat-zero-length]
defaultColor = strip.Color((255 * .5),(255 * .5),(60 * .5)); // This seems close to incandescent color
^
sparkpixels.cpp:272:21: warning: zero-length gnu_printf format string [-Wformat-zero-length]
sparkpixels.cpp: In function 'int colorAll(uint32_t)':
sparkpixels.cpp:437:25: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
//Used in all modes to set the brightness, show the lights, process Spark events and delay
^
sparkpixels.cpp: In function 'void colorChaser(uint32_t)':
sparkpixels.cpp:484:17: warning: unused variable 'j' [-Wunused-variable]
}
^
sparkpixels.cpp: In function 'void fadeInToColor(uint32_t, Color)':
sparkpixels.cpp:1088:21: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
Color retVal;
^
make[1]: *** [../build/target/user/platform-6sparkpixels.o] Error 1
make: *** [user] Error 2

Error: Could not compile. Please review your code.

This just stopped working, as if something in the cloud, or somewhere, was changed without my knowledge.
Now I can't compile my code, and I can't even decipher what the heck these cryptic CloudClass::variable messages refer to - it's like someone decided to remove the function signature from the cloud or something.

Very - no wait - VERY!!! - frustrated right now. Showstopper.

Hi @wmoecke

The two STRING lines in your code (modeList and mode) are in error: you should not have the "&" for a string (char array), and a recent compile-time check was put in place to enforce this.

This is probably the number one cloud coding error I see here so I am happy to have the check in place.
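In other words (a sketch only - the declarations of modeList and currentModeName are assumed here, since the original post doesn't show them), the STRING registrations drop the "&" while the numeric ones keep it:

int wifi = 0;
char modeList[622];            // char arrays are passed as-is...
char currentModeName[64];

void setup()
{
    Particle.variable("wifi",     &wifi,           INT);     // "&" stays for INT/DOUBLE
    Particle.variable("modeList", modeList,        STRING);  // ...no "&" for STRING
    Particle.variable("mode",     currentModeName, STRING);
}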

4 Likes

Fine. Here's the thing - we're getting a bit annoyed by these "up-to-the-minute" changes you guys put in place, and by the fact that we only find out about them this way, when our builds break.

The documentation you guys put up in the Particle apps reference is so outdated that it doesn’t even mention the change from “Spark.variable” to “Particle.variable”.

This is how Microsoft became one of the most hated companies in the world. Just a word of advice…

I’m gettin’ sick of this.

2 Likes

The change there is actually not there to annoy you but to protect you from yourself!

If you write correct code the error won't pop up, but before this "annoying" feature was introduced, your code would just have built without functioning correctly.
What do you prefer? Being told that there is something wrong with your code before deploying it, or thinking it works, only to wonder later on why it doesn't?

BTW: Deprecating functions is something that happens at other companies - besides Microsoft - too, and the warnings (if you look at them) do warn you in time.

2 Likes

The change is in the release notes for the 0.4.6 release:

  • Compile-time checks for Particle.variable() #619

The check is there because the code you wrote is in fact wrong and won’t work the way you think it might. So I’m sorry for your frustration but this is meant to help you.
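Concretely, the mismatch the check catches is this (a sketch - the 622-byte size is taken from the compiler output above; the declaration itself is an assumption):

char modeList[622];                // assumed declaration

const char *ok  = modeList;        // an array decays to char*, which the STRING
                                   // overload (const char*) accepts
// const char *bad = &modeList;    // &modeList is 'char (*)[622]' - the exact type in
                                   // the error - and it matches none of the overloads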

If you don’t want your devices automatically updated to the latest version of firmware, you can select a version other than “latest” for your device in the “devices” drawer of the Web IDE.

3 Likes

The ANNOYING thing is that you guys are putting up changes - and these changes affect all cloud users - WITHOUT ANY WARNING and WITHOUT PROPER DOCUMENTATION to give us some clue.

Those cryptic errors that the compiler threw at me are no help either.

And “Protect me from myself” - please.
I have worked in software development for over 15 years. Don't patronize me.

And now, to add to the giggles - NONE of my applications will flash from the cloud.

Wonderful. Just wonderful. Keep going.

@ScruffR, you really talk like a Microsoft guy. I hope that's not gonna become a trending practice at Particle. Otherwise I see my Photon quickly going into the "forgotten" drawer. I'll switch to an Arduino + ESP8266 quicker than you guys can say "we protect users from themselves".

@wmoecke Thanks for the perfectly named topic. I had this same error in my code (and thanks to you, found the fix nearly instantly). I totally get the frustration, although I can't claim to know exactly what you are dealing with. My old boss would have forbidden me from using a platform like Particle. To me it isn't a matter of right and wrong; it is about managing expectations and assumptions. Particle draws their line a little closer to the edge than most. But I have to admit this platform is letting me do amazing things without much outside help.

If you decide to stick with Particle, here are some things I now do that I learned the hard way.

  • Make sure you can compile your application locally for any piece of released code. As you now know, the cloud compile can change quickly.
  • Expect there to be bugs with each new feature. I am super stoked about system threads (just released), but I bet there will be some issues. For me though, the Particle team is still better than my team (wait, I don't have a team… oh yeah, thanks Particle, I could never do this by myself).
  • Ask the community early and often. I won't argue that a good community is better than good documentation, but it is better than having neither. I would give the community an A+ and the documentation a C-.
3 Likes

No one is patronizing you; rather, we're reminding you of the fact that you're only human. Unfortunately, humans are far from perfect. We make mistakes, lots of them, regardless of how long we've been doing certain things. Like it or not, having the "&" was a mistake, even with your 15 years of experience. But who cares - everyone makes mistakes.
If anything, you should be mad that these checks have been introduced only now, and not earlier. They help you spot mistakes and fix code. Does it 'break' your code, if that's what you want to call it? Well, not really. It was already broken; the compiler now just refuses to build it, instead of leaving you staring in wonder as to why it's not working. Rather than criticizing, consider making suggestions for improvement. That way, it's constructive, and can be put up for consideration in future development.


The entire platform is a work in progress. Unlike Arduino, which has been around for a fair amount of time, the Particle products are relatively new, and lots of improvements are yet to be implemented. I can assure you they're working ridiculously hard to make sure these features get out there as soon as possible. I agree with you that there's lots of room for improvement, but then again, I also know they're working on it as we speak.
That said, even the folks at Particle are only human. Unlike the tools they provide you, there's no one to protect them from themselves, so their errors occasionally make it to the public, not unlike the code you've been flashing with ampersands. Unfortunately, that happens from time to time, and it's mostly detected by humans, such as yourself. Again, letting them know in a constructive way is most beneficial to both parties.

You've been referring to @bko and @ScruffR as "you guys", as if we (the 'elite') were from Particle. We're not. The 'elite' are volunteers who put in their own time to help out. Unless explicitly stated, nothing we say is an official Particle statement, and it shouldn't be interpreted as such.

2 Likes

You had an error in your code. The error is now detected. Great!

If you wanna switch to the ESP8266: good luck. Mine is not very stable… In fact, it is very unstable. Getting better, but still unstable.

"I had an error in my code"?

Did I?

Spark.variable()
Expose a variable through the Cloud so that it can be called with GET /v1/devices/{DEVICE_ID}/{VARIABLE}.
Returns a success value - true when the variable was registered.
It is fine to call this function when the cloud is disconnected - the variable
will be registered next time the cloud is connected.
// EXAMPLE USAGE

int analogvalue = 0;
double tempC = 0;
char *message = "my name is particle";

void setup()
{
    // variable name max length is 12 characters long
    Spark.variable("analogvalue", &analogvalue, INT);
    Spark.variable("temp", &tempC, DOUBLE);
    if (Spark.variable("mess", message, STRING)==false)
        // variable not registered!
    pinMode(A0, INPUT);
}

void loop()
{
    // Read the analog value of the sensor (TMP36)
    analogvalue = analogRead(A0);
    // Convert the reading into degree celcius
    tempC = (((analogvalue * 3.3)/4095) - 0.5) * 100;
    delay(200);
}

Currently, up to 10 cloud variables may be defined and each variable name is limited to a max of 12 characters.
There are three supported data types:
INT, DOUBLE, STRING (maximum string size is 622 bytes)

Show me.

I know, as a programmer, that arrays are the exception to the rule of using the address-of operator. Period.
But then again, there's the Particle Cloud class, with its own implementations - AND A FAILED DOCUMENTATION which, up to this day, was accompanied by a forgiving compiler.

So let's play Courthouse:

  • I haven't been advised against the improper use of a C language operator in a CUSTOM FUNCTION IMPLEMENTATION by THE OWNERS OF THE PLATFORM.
    They INSIST on maintaining OUTDATED INFORMATION, they PUSH UPDATES WITH NO PRIOR NOTICE and NO REGARD FOR THEIR ONGOING CUSTOMER BASE.
    And to top it off, THEY BLAME CUSTOMERS for THEIR MISTAKES.

Ah, Microsoft. Oh no, it's not Microsoft. It's PARTICLE.

And about the "ESP8266 not stable" debate - looks like another breed of those endless Microsoft Windows vs Linux discussions.
I can GUARANTEE you: Arduino has more libraries supporting it. Arduino has more STABLE FIRMWARE. Arduino has a WIDER USER BASE, with solutions already available for many problems.

I cannot STAND a company that, as a startup, does not know the MEANING of HUMBLENESS and is not open to CORRECTIVE CRITICISM.

In short: I'm sick of being BS'd around.

BYE!!

Signed,

  • Once a supporter.

Your point being with this?

How would you read this part of your own quote - the STRING example that passes message without any "&"?

But your own code does use "&" on its string variables.

So, yes, you did!

And ...

... nope, Particle does not blame anyone; rather, some individuals (not affiliated with Particle) try to explain causes and reasons, and to defend Particle against allegations like "WITHOUT ANY WARNING and WITHOUT PROPER DOCUMENTATION to give us some clue" and "PUSH UPDATES WITH NO PRIOR NOTICE".

Not seeing is no proof of non-existence.

But, well ...

I imagine it is frustrating, but the code you posted clearly shows "&" on your string variables, and the example code from the documentation does not. Speed, agility and continuous deployment are the name of the game these days. In the interest of fairness, though, what would you consider to be ample "warning" of impending releases? They document bugs openly, and what bugs are being fixed and how is also freely available information. Is an email with release notes sufficient? I won't get into the vitriol about how anything is better than anything else, but the facts are clear here:

  1. The code you posted in fact didn't/couldn't work.
  2. The example code/documentation reflects the proper way to handle STRING Spark.variables.
  3. You have personal control over what firmware is deployed to your devices.
  4. Causing a compiler error for a known bad implementation is the right thing to do for Particle and the user community.
  5. Some increased level of communication is requested, but no specific guidance is provided.
    a. We can admire the problem or offer solutions; admiring a problem is also known as bitching.
  6. No one employed by Particle has weighed in; the replies are in fact from other users who appreciate the effort and feel that your spewing of discontent was disproportionate to the problem.
  7. No working code was harmed in the release of this compiler update.
    a. I think this is a key factor in determining how aggressive one should be.
  8. All platforms release things, break stuff, deprecate, add new goodies and add useless "features". As users/community members who would like to see things handled differently or improved upon, we should offer concise and constructive advice, phrased as a request.
    a. We are users/customers, not board members, so our vote is in dollars; if you are just that unhappy, then "vote" appropriately, but no one in this community or employed by Particle deserves to be blasted or belittled in the manner you have done.
4 Likes

Sorry to correct here: yes, there was this post,

which seems rather decent to me. Or can anybody see something wrong with it?

I certainly can't!

2 Likes

Thanks for the clarification and I agree, it was 100% appropriate. Informative, helpful and on topic!

2 Likes

Bye

I would like to see this from the command line. The Web IDE is awesome to get started, but I migrated to local builds for version control, library management, etc. I kinda got caught with my pants down when you guys took away the ability to set listen-mode flags and the "send multicast" function from the application space (for the Core). I was migrating to the Photon anyway, but it really did put a kink in the project.

For me, threading is #1. If I had full threading, I don't think my code would be so dependent on non-supported features. Thanks for the beta; looking forward to the full release.

This is a good point, and I've raised having version targeting in Dev and the CLI internally.

Can you tell me more? Those flags are internal and not part of our published API. What were you using them for? Do you know you can exit listening mode by calling

WiFi.listen(false)

On an interrupt?
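Something along these lines, for instance (a sketch only - the button pin and wiring are assumptions, not taken from the project discussed here):

void exitListeningISR()
{
    WiFi.listen(false);              // leave listening mode
}

void setup()
{
    pinMode(D2, INPUT_PULLUP);       // assumed: momentary button from D2 to GND
    attachInterrupt(D2, exitListeningISR, FALLING);
}

void loop()
{
}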

I'd like to hear more about "send multicast". All the code on the Core is still available; it may simply be a matter of including the appropriate header file.

Hmmm… I will double-check the "send_multicast". I am chest-deep in trying to figure out how not to block in listening mode, or to find a workable use case if we need to block in listening mode (Photon).

Maybe "caught with my pants down" was a little strong. My code stopped compiling, so I needed to fix it. It would have made my life easier if I could just type particle compile core firmware fw.bin 0.4.3 (or something). I realize the world doesn't revolve around my wants, but I am just saying…

Anyway, the flag CONFIGE_WLAN_DONE (or whatever it is named) was the exit-listening-mode thing, and the function works fine. No biggie.

Going from memory here… send_multicast() is what you guys do at the end of listening mode on the Core, after a successful configuration through SmartConfig.
Reason 1: For reasons I never figured out, sometimes your mobile app and my custom app would miss this message the first time (especially after I had configured a couple of Cores in a row), so I would resend it a few times, which improved the success rate.
Reason 2: For our deployment we "own" all the Spark Cores we send out. We then link the end user (customer) with a Spark Core (in our product) at the time of configuration with a custom app. If the user sends the WiFi credentials to a Spark Core, the credentials reach the Spark Core, and the Spark Core successfully connects to WiFi and the cloud, then the Spark Core will never go into listening mode again (duh, why would it). Now, if the association of the end user (customer) and the device ID fails for whatever reason, it becomes impossible for the app to get the device ID (without putting the device back into listening mode or connecting a cable… I realize these are self-imposed constraints, but we really wanted a "headless" device, so it has no buttons). Anyway, to combat this, the app can send out a UDP message saying, "If there is a connected Spark Core on this network, please send me your device ID." Originally I wrote a UDP message to send back, but my mobile developer messed that up, so I use send_multicast, which sends a CoAP message that the mobile dev (for whatever reason) implemented properly. Kinda wordy, but there it is.
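For what it's worth, a rough sketch of that plain-UDP fallback (the port number, the "WHO" discovery message and the use of System.deviceID() are all assumptions here - this is not the send_multicast/CoAP path):

UDP discovery;
const uint16_t DISCOVERY_PORT = 5000;        // assumed port

void setup()
{
    discovery.begin(DISCOVERY_PORT);
}

void loop()
{
    if (discovery.parsePacket() > 0)
    {
        char msg[8] = {0};
        discovery.read(msg, sizeof(msg) - 1);
        if (strcmp(msg, "WHO") == 0)         // hypothetical discovery message
        {
            String id = System.deviceID();   // Spark.deviceID() on older firmware
            discovery.beginPacket(discovery.remoteIP(), discovery.remotePort());
            discovery.write((const uint8_t *)id.c_str(), id.length());
            discovery.endPacket();
        }
    }
}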

FYI, it may be useful to add a release note stating that uint16_t (and I read a post mentioning uint32_t) doesn't work for INT Particle.variables.
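A short illustration of that note (the names are made up; the point is the backing type):

int counter = 0;                     // works: INT variables want a 32-bit int behind them
// uint16_t counter16 = 0;           // doesn't work as the backing store for an INT variable

void setup()
{
    Particle.variable("counter", &counter, INT);
    // Particle.variable("counter16", &counter16, INT);   // rejected at compile time
}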