USB Connection Problems. Ubuntu 14.04 [Solved: use sudo]

To debug, I need to use serial output in an Ubuntu terminal, but I get a connection error.

Using the CLI gives me:

$ spark serial list
Found 1 core(s) connected via serial: 
1:    /dev/ttyACM0

So the core is visible. But then:

$ spark serial monitor 1
Opening serial monitor for com port: "/dev/ttyACM0"
Serial err: Error: Cannot open /dev/ttyACM0
Serial problems, please reconnect the core.

I assume that reconnect means unplug then replug, which also removes power in my case.
I am a member of the dialout group:

sudo usermod -a -G dialout xxxxx
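One thing worth double-checking (a sketch, assuming the standard Ubuntu group name dialout): usermod -a -G only affects new login sessions, so a terminal opened before running it will not see the group yet.

```shell
# Check whether the *current session* is actually in the dialout group;
# group changes from usermod only take effect after logging out and back in.
if id -nG | grep -qw dialout; then
    echo "dialout: yes"
else
    echo "dialout: no (log out and back in after usermod)"
fi
```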

how about sudo spark serial monitor 1?


kennethlimcp is correct - this works:

sudo spark serial monitor 1
Opening serial monitor for com port: "/dev/ttyACM0"

:beers: :smiley:

Anyone know why sudo privilege is needed?


Good, but a few more steps are required to use the Spark CLI without sudo for every command involving serial.

To fix this, run this command in the terminal:

sudo chmod a+rw /dev/ttyACM0

You will then be able to use all of the spark commands without sudo; however, this must be done each time you connect your core.
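To see what a+rw actually changes, here's a sketch on a scratch file rather than the real device node (which needs the hardware attached and root; assumes GNU coreutils stat -c):

```shell
# Illustrate the permission change `a+rw` makes, using a temp file.
f=$(mktemp)
chmod 600 "$f"
stat -c '%a' "$f"   # prints: 600 (owner-only read/write, like a locked-down tty)
chmod a+rw "$f"
stat -c '%a' "$f"   # prints: 666 (read/write for everyone, which is what
                    # the command above grants on /dev/ttyACM0)
rm -f "$f"
```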

A better way is to create a rule in /etc/udev/rules.d. This enables the chmod command to be run automatically when your core is connected.
Here are the steps:

sudo nano /etc/udev/rules.d/50-local.rules

(Paste this line)

ACTION=="add", ATTRS{idProduct}=="607d", ATTRS{idVendor}=="1d50", DRIVERS=="usb", RUN+="/bin/chmod a+rw /dev/ttyACM0"

Press Ctrl+O, then Enter to save, and Ctrl+X to exit.

Now whenever you plug in your Spark Core, the chmod command will run automatically, giving you full access to the device without ever having to type your sudo password for it again.
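As a variation (a sketch I haven't tested on 14.04, but it's the more common udev idiom): instead of calling chmod, the rule can set the permissions and group directly with MODE and GROUP, which also avoids hard-coding the /dev/ttyACM0 name:

```
# /etc/udev/rules.d/50-local.rules (same VID/PID as above)
SUBSYSTEM=="tty", ATTRS{idVendor}=="1d50", ATTRS{idProduct}=="607d", MODE="0666", GROUP="dialout"
```

After editing the file, reload the rules with sudo udevadm control --reload-rules (or just replug the Core).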

Note:
Your product ID ({idProduct}) and vendor ID ({idVendor}) may be different. To check, run

lsusb

And substitute your results into the /etc/udev/rules.d/50-local.rules file you created.

Bus 003 Device 026: ID 1d50:607d OpenMoko, Inc.

Was my Spark core when I ran lsusb.

When it says ID 1d50:607d, the first hexadecimal value is the vendor ID and the second is the product ID.

If yours are different, change them in /etc/udev/rules.d/50-local.rules.
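If you want to script the substitution, shell parameter expansion can split that ID pair (a sketch; 1d50:607d is just the example value from the lsusb output above):

```shell
# Split an lsusb "ID vvvv:pppp" pair into idVendor and idProduct.
id="1d50:607d"      # value taken from the lsusb output above
vendor=${id%%:*}    # everything before the colon -> idVendor
product=${id#*:}    # everything after the colon  -> idProduct
echo "idVendor=$vendor idProduct=$product"
# prints: idVendor=1d50 idProduct=607d
```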


This is a device file on a Unix system, which is a protected resource. Users are generally restricted in their control of these resources.


nrobinson2000, thank you for the recommendation. Seems like something that should be included in the spark documentation somewhere.

I still don't understand why this is needed, though. I don't need anything similar to access USB drives, Arduinos, or various other devices. In response to aceperry: I've read that it's not a good idea to run processes with superuser privileges, as this can leave my Linux system open to attack. The general advice I see is that superuser access should be limited to maintenance activities where system-level changes are needed.


I still need to do this when using my Arduino.

@nrobinson2000,

my guess would be that for Arduinos in general on Linux machines, the USB serial port appears as /dev/ttyUSB0

However, for Spark devices it appears as /dev/ttyACM0, which might be what requires the sudo privilege.

I’m saying this because on another development board I tried tinkering with, there were issues even on Mac OS X where we had to link the USB profile to another alias in order to run it properly.
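For what it's worth, a quick way to see which of the two names is present on a given Linux box (a sketch; the names are the typical defaults, where an FTDI-style adapter enumerates as ttyUSB* and a CDC-ACM device like the Core as ttyACM*):

```shell
# List whichever of the two common Linux serial device names is present,
# falling back to a message when neither glob matches anything.
ls /dev/ttyUSB* /dev/ttyACM* 2>/dev/null || echo "no serial devices found"
```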

A lot of other packages come with rpm install scripts that will do things like add udev rules to change the permissions on USB devices they want to talk to.

What you've heard is correct: as a rule you want to avoid running packages under sudo, because it results in painful things like files and directories created in your $HOME that you cannot access, overwrite, or delete.

The correct fix is adding and maintaining udev rules for the Spark VID/PIDs, and installing them automatically as part of the install process. If this is supposed to happen already, then it's broken; if not, then it should be made to happen.


Yup so there’s no default behavior/implementation for this.

Maybe @mdma can take care of this :wink:

Great idea! I'd love it if we could spend some time this summer improving the install experience. I created a task for the CLI Linux bit here - https://github.com/spark/spark-cli/issues/184

Thanks,
David


I’m using Linux Mint 17.1, and my Arduino shows up as /dev/ttyACMx


Ditto Lubuntu 14.10 where Spark Core shows up (in the Arduino IDE) as /dev/ttyACMx. This is probably the case for all Ubuntu derivative installs.

Another location choice that appears in the IDE is /dev/ttySn.


I have not tried this, but supposedly this is a group (permissions) problem.


I have this issue on Mac OS X: I get “No devices available via serial”. I've tried connecting the USB to different ports, but it's still the same. How did you do that profile linking you mentioned?

Which OS X version are you on? We rarely see issues with Macs. Can you try another cable and use particle serial list? Also, make sure the device is in listening mode.

Yosemite 10.10.4.
Node v0.12.4

My device is an original Core. I just can't assign the Wi-Fi credentials using my phone; last year it was working fine with my phone, but now it just keeps blinking blue. That's why I want to use particle-cli, but “! serial: No devices available via serial” is all I get.

  • Is the device blinking blue?
  • Did you try changing a cable?
  • Try ls /dev/tty.* and ls /dev/cu.*