I think you mean that the AC-to-AC transformer is just there to give the Spark Core a way to measure the AC voltage and determine its phase relative to the current waveform. You should be able to power your Spark Core from a separate 5V micro USB wall adapter.
@BDub Hey man, I have the 30 A current clamp now. Would it be easy for you to tell me how to connect this clamp to the Spark Core's input pins? And then what code would I need to run to get an accurate current reading?
I see others have just taken the current reading and multiplied it by a fixed 120v AC to get wattage readings. That would probably be fine for me for now.
I looked over the code on the Open Energy Monitor site and read the info, but it's going to take more time to really understand what's going on.
Data sheet for the SCT series current clamps - good old Google!
Wire to your Spark as per my schematic. If you only want current, just use the parts connected to Analog In 1. That is: one side of the current clamp to A1, the other side to a 1.65 volt rail obtained by putting a 10k resistor from gnd to the 1.65v rail and a 10k resistor from 3V3 to the 1.65v rail. Remember the 0.1uF decoupling capacitor between A1 and ground, and a bigger one, say 10uF, between the 1.65v rail and gnd.
As the Open Energy Monitor web site explains, the ADC accepts 0 - 3.3v while the current clamp delivers 1v ac (at 30 amps), so by connecting the clamp between the 1.65v rail you have just constructed and A1, the analog pin sees 1.65 ± 1v = 0.65 to 2.65 volts, which is nicely in the middle of the ADC range. (1v ac RMS from the clamp is about 1.4v peak, so the voltage delivered to the Spark for 30 amps along the mains cable is 0.25v to 3.05v - the thing to do is check your voltage before applying it to the Spark, bearing in mind that the peak is 1.4 times the RMS for a sinusoidal signal.)
P.S. For current-only measurements you call emon1.calcIrms() rather than emon1.calcVI()
I haven't used that bit of code myself. I've looked it over and added a 250uS delay - I've just updated this in the Gist files - so it should work just as it does with the Arduino examples. When you call it you will need about 10 cycles' worth of data to get a decent RMS current. Because zero crossings are not a reliable measure with current measurements, you call the function with the number of samples rather than the number of cycles as you do with calcVI().
One sample takes 250uS delay + 45uS ADC time + about 5uS of sums = 300uS, so 10 cycles at 50 Hz needs 10/50 = 0.2 seconds of data: 0.2/0.0003 = 667 samples.
So in your code you have:
EnergyMonitor emon1; // in the variables declaration
float myRMScurrent;
emon1.current(1,30); // in setup() use 30 for a SCT-013-030 or 60 for a SCT-013-060
// the 060 clamp is for heavy energy users
myRMScurrent = emon1.calcIrms(667); //in loop()
Why current zero crossings are not a reliable measure of the number of mains cycles
[waveform chart]
The green line is my measured current when the amount I generate is close to the amount I am using.
@kareem613
That looks similar to the ac supply I use. 500mA is more than enough as the only load is a 100k and 10k resistor in series. 9v across 110k ohms is 80uA. Check the output voltage before you attach it to the Spark - cheap supplies don't always deliver what they say, especially at low loads. If the voltage across the 10k resistor is higher than 1v measured with a voltmeter on its ac setting, then put a lower value resistor in place of the 10k one.
If the supply is 9 volts then the voltage across the 10k resistor, which is what you measure with A0, is 9v*10k/(10k+100k) = 0.8v RMS = 1.15v peak, which with the 1.65v reference gives you 0.5v to 2.8v on the ADC.
@BDub is completely right about what the 9v supply is for. The ac power supply is only for a voltage reference and has nothing to do with powering the Spark. The Spark must have a DC supply!
Just updating the major parts list as a summary.
- Current sensor: SCT-013-030
- Reed switch: COM-10601
- Photoresistor: CdS Photoresistor
- AC/AC power supply: EPA090050-S/T-SZ
I found an AC supply at Digi-Key with bare wire leads. They also have a 6VAC version. Still not low enough to connect directly to the Spark Core. Do you think that might be better anyway?
@kareem613
I couldn't follow the power supply link but I doubt there's much to choose between the power supplies. If your software development goes like mine you will spend a lot of time unplugging the Spark from close to your meters to reprogram it. I have a 10-pin header with a ribbon cable to a connector block. That way I just have to disconnect the header to move the Spark, and I'm sure to get all the right things in the right places when I reconnect. The current sensor is SCT-013-030.
The Open Energy Monitor project put everything in a nice box, so if you are planning to do that then the supply with a plug would be neater. In my case the leads wouldn't have been long enough, so I have everything plugged into flying leads going into the connector block.
I see from my photo that I've actually used a 20uF capacitor on the 1.65v line - I hadn't noticed that before. The exact value is unimportant.
Good idea with the ribbon cable. Makes the board much tidier.
I'm going to dive into this over the weekend, probably. Way too much going on right now to try to do this, but I am looking forward to getting this up and running with an LCD display and online data logging.
Interface Directly between Spark and emoncms
@RWB, @Hootie81, @kennethlimcp, @BDub, @kareem613
Now I have a spare Core I've revisited this project. If you like, you can ditch all my Raspberry Pi and Arduino code and communicate directly between the Spark Core and a PC running OpenEnergyMonitor's emoncms (which I found a pig to set up - PHP and MySQL have been substantially revised since they wrote their instructions). All that is needed at the Spark end is to send the data to emoncms using the Spark's TCP client library. The following code sends a label:value pair to emoncms, updating the value every 10 seconds. emoncms then looks after all your data management and plotting requirements …
//check whether Spark will upload data to emoncms
//16 May 2014
TCPClient client;
int ledPin = D7; //light LED when data sent successfully
int i=0; //dummy data
void sendData(int i){
client.connect("192.168.1.113",80); //the address of the PC with emoncms running
delay(500);
if (client.connected()) {
client.flush(); //clear any rubbish
client.print("GET /emoncms/input/post.json?json={power:"); //no comma after GET
client.print(i); //insert power value (obtained from a call to emonlib in the code I posted previously)
client.print("}&apikey=put your emoncms key here");
client.println(" HTTP/1.0");
client.print("Host: ");
client.println("192.168.1.113");
client.println("Connection: close");
client.println(); //blank line completes the request before we wait for a reply
delay(500); //wait for response
if (client.available()){
if (client.read()=='H') digitalWrite(ledPin, HIGH); //set LED high if server responded (reply starts "HTTP/1.0 200 OK")
}
client.flush();
delay(400);
client.stop();
}
else {
client.flush();
client.stop();
digitalWrite(ledPin, LOW);
}
}
void setup() {
pinMode(ledPin, OUTPUT);
}
void loop() {
digitalWrite(ledPin, LOW);
sendData(i++);
delay(10000);
}
I pinched most of the TCP code from various comments people have made in this community. It will be clear that I don't really know what I'm doing, so any improvements to this code snippet would be most welcome.
The main point is that if you want to do Open Energy Monitor stuff, you can replace emonTx with a Spark, a photo resistor or current clamp, an optional ac supply and a couple of capacitors - magic - well done Spark team.
@phec So I can get a better visual understanding of your new setup, can you upload a picture of the Spark and your sensors?
You're saying that you can just ditch the emonTx and use the Spark as a WiFi-enabled replacement, which is what you were doing before, right? You just update the code you were using?
So you can transmit the sensor data every 10 seconds to a PC running the emoncms software. Can the Spark just send that sensor data to the web-based emoncms interface here: http://emoncms.org if somebody does not want to use a local PC to receive the data?
That's right, it is just a software change. The Spark is as in the photo posted earlier. The CT sensor, photoresistors, reed switch and low voltage ac supply are connected to the screw terminals as labelled in the photo.
I've not used the web service emoncms but it should work exactly the same. You would just change the address of the client from 192.168… to emoncms.org
When I have a bit more time I'll write an emoncms-only version of the Spark code. It will be quite a bit shorter than the UCF version but won't include the waveform.
@phec A version just for the emoncms.org site would be awesome.
I can see myself using this with our larger portable solar generator systems. They would be near WiFi access, so they could upload the system performance data directly to the web, and we could track each system's performance from anywhere in the world.
Here you go - I said it would be much shorter.
The Spark emon library is identical to before.
// This #include statement was automatically added by the Spark IDE.
#include "SemonLib20.h"
//16 May 2014
//Collect current clamp, photoresistor and reed switch energy measurements
//and send to emoncms
TCPClient client;
EnergyMonitor emon1;
int ledPin = D7;
int i=0;
unsigned long int currentTime, previousPoll;
const long FLASHKWH = 3600; // 1 flash per sec is this many watts
const float TICKKWH = 400000.0; // 1 gas switch per sec is this many watts
const int NLOOPGAS = 20; // check gas every few loops 5 minutes for 15sec query
const int SAMPLEINTERVAL = 10000; //time between samples
int gasCount = 0; // count number of times round loop since last gas update
const long DEBOUNCE = 200;
int gasPin = D0;
int genPin = D1;
int expPin = D2;
volatile unsigned long lastGas; //time since last flash for debounce
volatile unsigned long lastGen;
volatile unsigned long lastExp;
volatile int nGas = 0; //number of flashes
volatile int nGen = 0;
volatile int nExp = 0;
float powerGas; //power values
float powerGen;
float powerExp;
void sendData(float Pgas, float Pgen, float Pexp){
client.connect("192.168.1.113",80);//your PC address or emoncms.org
delay(500);
if (client.connected()) {
client.flush(); //clear any rubbish
client.print("GET /emoncms/input/post.json?node=1&csv="); //no comma after GET
client.print(Pgas);
client.print(",");
client.print(Pgen);
client.print(",");
client.print(Pexp);
client.print("&apikey=your emoncms apikey");
client.println(" HTTP/1.0");
client.print("Host: ");
client.println("192.168.1.113");
client.println("Connection: close");
client.println(); //blank line completes the request before we wait for a reply
delay(500); //wait for response
if (client.available()){
if (client.read()=='H') digitalWrite(ledPin, HIGH); //set LED high if server responded (reply starts "HTTP/1.0 200 OK")
}
client.flush();
delay(400);
client.stop();
}
else {
client.flush();
client.stop();
digitalWrite(ledPin, LOW);
}
}
//interrupt routines
void gasInt() {
unsigned long thisTime;
thisTime = millis();
if ((thisTime - lastGas) > DEBOUNCE) {
lastGas = thisTime;
nGas++;
}
}
void genInt() {
unsigned long thisTime;
thisTime = millis();
if ((thisTime - lastGen) > DEBOUNCE) {
lastGen = thisTime;
nGen++;
}
}
void expInt() {
unsigned long thisTime;
thisTime = millis();
if ((thisTime - lastExp) > DEBOUNCE) {
lastExp = thisTime;
nExp++;
}
}
void setup() {
pinMode(ledPin, OUTPUT);
previousPoll = millis();
emon1.voltage(0, 250.0, 2.0); //initialise emon with pin, Vcal and phase
emon1.current(1, 30); //pin, Ical correct at 1kW
pinMode(gasPin, INPUT);
pinMode(genPin, INPUT);
pinMode(expPin, INPUT);
attachInterrupt(gasPin, gasInt, RISING);
attachInterrupt(genPin, genInt, RISING);
attachInterrupt(expPin, expInt, RISING);
lastGas = previousPoll;
lastGen = previousPoll;
lastExp = previousPoll;
}
void loop() {
digitalWrite(ledPin, LOW);
currentTime = millis();
if (currentTime - previousPoll > SAMPLEINTERVAL){ //report data every 10 seconds (safe across millis() rollover)
previousPoll = currentTime;
//update data from emonLib
emon1.calcVI(20, 1600);
// we're not using nExp as we have emon1.realPower available
nExp = 0; //reset interrupt counter
// now deal with PV meter flashes
powerGen = (float) FLASHKWH * nGen / (1.0 * SAMPLEINTERVAL);
nGen = 0;
// now deal with gas ticks of the reed switch
// only update gas every NLOOPGAS loops (20 = 5 min as ticks are slow)
gasCount++;
if (gasCount == NLOOPGAS) {
gasCount = 0;
powerGas = TICKKWH * nGas / (1.0 * NLOOPGAS * SAMPLEINTERVAL);
nGas = 0;
if (powerGas > 40) {//trap chatter if meter stops mid switch
powerGas = 0;
}
} //end of slow gas calculation
sendData(powerGas,powerGen,emon1.realPower);//send data to emoncms
}
}
I've checked this runs on a Spark and transfers data to emoncms on a PC. I've not tested it extensively as I don't want to disconnect the original system for long.
I'm very new to OpenEnergyMonitor, but love the concept.
How can I use my Spark, but not use the Raspberry Pi?
I have mains voltage and two current IC outputs (sine waves) plus temperature on the Spark already
@kiwibird1 Yes, there are two ways that you can use the Spark without a Raspberry Pi. The simplest is to use the OpenEnergyMonitor project's emoncms data logging and display. This can be installed on a PC or Linux computer, or you can register for the OpenEnergyMonitor project's web-based service. EmonCMS
Post 54 on this thread gives the Spark code to upload to EmonCMS. @RWB is looking into this approach so you could share experiences.
The alternative, if you want a stand-alone system, is to attach a display to the Spark. I haven't tried this, but there is plenty of stuff on this forum about attaching displays: Displays - What works, What doesn't. Given that all the information is right there on the Spark there are all kinds of possibilities for displaying instantaneous and cumulative energy use, and there are plenty of unused analog inputs that you could use to monitor indoor and outdoor temperature too - though without another computer you'd need to keep an eye on memory use.
Hello,
First of all I want to congratulate you on your project. I am trying to build an energy monitor with two currents and one voltage on a Spark Core.
I am compiling and uploading locally using NetBeans. I have tried other code before with no problem. Now, with yours, I am receiving an error similar to this:
"undefined reference to `EnergyMonitor::calcVI(int, unsigned int)'"
I have included in my application.c file:
#include "SemonLib20.h"
#include "application.h"
I have also included in the folder core-firmware/src the file SemonLib20.c and in the folder core-firmware/inc the file SemonLib20.h
Could it be a makefile error? Can anybody help me, please?
Javier, (@javier_pelaez),
The code in my example is C++ not C.
I don't touch application.cpp but include the Spark Energy Monitor files in addition to application.cpp.
You correctly put SemonLib20.h into core-firmware/inc;
SemonLib20.cpp and Senergy20.cpp go into core-firmware/src.
(There isn't a header file for Senergy20.cpp)
My makefile for NetBeans looks like this - ignore the files which are commented out; these are for other projects.
# This file is a makefile included from the top level makefile which
# defines the sources built for the target.
# Define the prefix to this directory.
# Note: The name must be unique within this build and should be
#       based on the root of the project
TARGET_SRC_PATH = src
# Add include to all objects built for this target
INCLUDE_DIRS += inc
# C source files included in this build.
CSRC +=
# C++ source files included in this build.
#CPPSRC += $(TARGET_SRC_PATH)/PCFFTsimple.cpp
#CPPSRC += $(TARGET_SRC_PATH)/BeeMon13.cpp
CPPSRC += $(TARGET_SRC_PATH)/SemonLib20.cpp
CPPSRC += $(TARGET_SRC_PATH)/Senergy20.cpp
#CPPSRC += $(TARGET_SRC_PATH)/testEmon1.cpp
CPPSRC += $(TARGET_SRC_PATH)/application.cpp
CPPSRC += $(TARGET_SRC_PATH)/main.cpp
CPPSRC += $(TARGET_SRC_PATH)/newlib_stubs.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_utilities.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_eeprom.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_i2c.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_interrupts.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_ipaddress.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_network.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_print.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_servo.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_spi.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_stream.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_string.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_tcpclient.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_tcpserver.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_time.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_tone.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_udp.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_usartserial.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_usbserial.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wiring_wifi.cpp
CPPSRC += $(TARGET_SRC_PATH)/spark_wlan.cpp
CPPSRC += $(TARGET_SRC_PATH)/stm32_it.cpp
CPPSRC += $(TARGET_SRC_PATH)/usb_desc.cpp
CPPSRC += $(TARGET_SRC_PATH)/usb_endp.cpp
CPPSRC += $(TARGET_SRC_PATH)/usb_istr.cpp
CPPSRC += $(TARGET_SRC_PATH)/usb_prop.cpp
CPPSRC += $(TARGET_SRC_PATH)/wifi_credentials_reader.cpp
# ASM source files included in this build.
ASRC +=
This should do it. If it doesn't, it could be that I've made a mistake uploading the code to GitHub - let me know.
My makefile is quite different from yours…
# Define the compiler/tools prefix
GCC_PREFIX = arm-none-eabi-
# Define tools
CC = $(GCC_PREFIX)gcc
CPP = $(GCC_PREFIX)g++
AR = $(GCC_PREFIX)ar
OBJCOPY = $(GCC_PREFIX)objcopy
SIZE = $(GCC_PREFIX)size
DFU = dfu-util
CURL = curl
RM = rm -f
MKDIR = mkdir -p
# URL to invoke cloud flashing
CLOUD_FLASH_URL = https://api.spark.io/v1/devices/$(SPARK_CORE_ID)\?access_token=$(SPARK_ACCESS_TOKEN)
# Recursive wildcard function
rwildcard = $(wildcard $1$2) $(foreach d,$(wildcard $1*),$(call rwildcard,$d/,$2))
# Define the build path, this is where all of the dependencies and
# object files will be placed.
# Note: Currently set to <project>/build/obj directory and set relative to
# the dir which makefile is invoked. If the makefile is moved to the project
# root, BUILD_PATH = build can be used to store the build products in
# the build directory.
BUILD_PATH = obj
# Path to the root of source files, in this case root of the project to
# include ../src and ../lib dirs.
# Note: Consider relocating source files in lib to src, or build a
# separate library.
SRC_PATH = ..
# Target this makefile is building.
TARGET = core-firmware
# Find all build.mk makefiles in each source directory in the src tree.
SRC_MAKEFILES := $(call rwildcard,$(SRC_PATH)/,build.mk)
# Include all build.mk defines source files.
include $(SRC_MAKEFILES)
# Paths to dependent projects, referenced from root of this project
LIB_CORE_COMMON_PATH = ../core-common-lib
LIB_CORE_COMMUNICATION_PATH = ../core-communication-lib
LIB_CORE_LIBRARIES_PATH = libraries/
# Include directories for optional "libraries" (e.g. Serial2.h)
INCLUDE_DIRS += $(LIB_CORE_LIBRARIES_PATH)/Serial2
# Additional include directories, applied to objects built for this target.
INCLUDE_DIRS += $(LIB_CORE_COMMON_PATH)/CMSIS/Include
INCLUDE_DIRS += $(LIB_CORE_COMMON_PATH)/CMSIS/Device/ST/STM32F10x/Include
INCLUDE_DIRS += $(LIB_CORE_COMMON_PATH)/STM32F10x_StdPeriph_Driver/inc
INCLUDE_DIRS += $(LIB_CORE_COMMON_PATH)/STM32_USB-FS-Device_Driver/inc
INCLUDE_DIRS += $(LIB_CORE_COMMON_PATH)/CC3000_Host_Driver
INCLUDE_DIRS += $(LIB_CORE_COMMON_PATH)/SPARK_Firmware_Driver/inc
INCLUDE_DIRS += $(LIB_CORE_COMMUNICATION_PATH)/lib/tropicssl/include
INCLUDE_DIRS += $(LIB_CORE_COMMUNICATION_PATH)/src
# Compiler flags
CFLAGS = -g3 -gdwarf-2 -Os -mcpu=cortex-m3 -mthumb
CFLAGS += $(patsubst %,-I$(SRC_PATH)/%,$(INCLUDE_DIRS)) -I.
CFLAGS += -ffunction-sections -Wall -fmessage-length=0
# Generate dependency files automatically.
CFLAGS += -MD -MP -MF $@.d
# Target specific defines
CFLAGS += -DUSE_STDPERIPH_DRIVER
CFLAGS += -DSTM32F10X_MD
CFLAGS += -DDFU_BUILD_ENABLE
CFLAGS += -DSPARK=1
ifeq ("$(USE_SWD_JTAG)","y")
CFLAGS += -DUSE_SWD_JTAG
endif
ifeq ("$(DEBUG_BUILD)","y")
CFLAGS += -DDEBUG_BUILD
else
CFLAGS += -DRELEASE_BUILD
endif
# C++ specific flags
CPPFLAGS = -fno-exceptions -fno-rtti
# Linker flags
LDFLAGS += -T../linker/linker_stm32f10x_md_dfu.ld -nostartfiles -Xlinker --gc-sections
LDFLAGS += -L$(SRC_PATH)/$(LIB_CORE_COMMON_PATH)/build -lcore-common-lib
LDFLAGS += -L$(SRC_PATH)/$(LIB_CORE_COMMUNICATION_PATH)/build -lcore-communication-lib
LDFLAGS += -Wl,-Map,$(TARGET).map
LDFLAGS += --specs=nano.specs -lc -lnosys
LDFLAGS += -u _printf_float
# Assembler flags
ASFLAGS = -g3 -gdwarf-2 -mcpu=cortex-m3 -mthumb
ASFLAGS += -x assembler-with-cpp -fmessage-length=0
# Collect all object and dep files
ALLOBJ += $(addprefix $(BUILD_PATH)/, $(CSRC:.c=.o))
ALLOBJ += $(addprefix $(BUILD_PATH)/, $(CPPSRC:.cpp=.o))
ALLOBJ += $(addprefix $(BUILD_PATH)/, $(ASRC:.S=.o))
ALLDEPS += $(addprefix $(BUILD_PATH)/, $(CSRC:.c=.o.d))
ALLDEPS += $(addprefix $(BUILD_PATH)/, $(CPPSRC:.cpp=.o.d))
ALLDEPS += $(addprefix $(BUILD_PATH)/, $(ASRC:.S=.o.d))
# All Target
all: elf bin hex size
elf: $(TARGET).elf
bin: $(TARGET).bin
hex: $(TARGET).hex
# Program the core using dfu-util. The core should have been placed
# in bootloader mode before invoking 'make program-dfu'
program-dfu: $(TARGET).bin
@echo Flashing using dfu:
$(DFU) -d 1d50:607f -a 0 -s 0x08005000:leave -D $<
# Program the core using the cloud. SPARK_CORE_ID and SPARK_ACCESS_TOKEN must
# have been defined in the environment before invoking 'make program-cloud'
program-cloud: $(TARGET).bin
@echo Flashing using cloud API, CORE_ID=$(SPARK_CORE_ID):
$(CURL) -X PUT -F file=@$< -F file_type=binary $(CLOUD_FLASH_URL)
# Display size
size: $(TARGET).elf
@echo Invoking: ARM GNU Print Size
$(SIZE) --format=berkeley $<
@echo
# Create a hex file from ELF file
%.hex : %.elf
@echo Invoking: ARM GNU Create Flash Image
$(OBJCOPY) -O ihex $< $@
@echo
# Create a bin file from ELF file
%.bin : %.elf
@echo Invoking: ARM GNU Create Flash Image
$(OBJCOPY) -O binary $< $@
@echo
$(TARGET).elf : check_external_deps $(ALLOBJ)
@echo Building target: $@
@echo Invoking: ARM GCC C++ Linker
$(CPP) $(CFLAGS) $(ALLOBJ) --output $@ $(LDFLAGS)
@echo
# Check that external dependencies are up to date
# Note: Since this makefile has no knowledge of dependencies for
# the external libs, make must be called on the libs for
# every build. Targets which depend directly on this recipe
# will then always be rebuilt, ie. $(TARGET).elf
check_external_deps:
@echo Building core-common-lib
@$(MAKE) -C $(SRC_PATH)/$(LIB_CORE_COMMON_PATH)/build --no-print-directory
@echo
@echo Building core-communication-lib
@$(MAKE) -C $(SRC_PATH)/$(LIB_CORE_COMMUNICATION_PATH)/build --no-print-directory
@echo
# Tool invocations
# C compiler to build .o from .c in $(BUILD_DIR)
$(BUILD_PATH)/%.o : $(SRC_PATH)/%.c
@echo Building file: $<
@echo Invoking: ARM GCC C Compiler
$(MKDIR) $(dir $@)
$(CC) $(CFLAGS) -c -o $@ $<
@echo
# Assembler to build .o from .S in $(BUILD_DIR)
$(BUILD_PATH)/%.o : $(SRC_PATH)/%.S
@echo Building file: $<
@echo Invoking: ARM GCC Assembler
$(MKDIR) $(dir $@)
$(CC) $(ASFLAGS) -c -o $@ $<
@echo
# CPP compiler to build .o from .cpp in $(BUILD_DIR)
# Note: Calls standard $(CC) - gcc will invoke g++ as appropriate
$(BUILD_PATH)/%.o : $(SRC_PATH)/%.cpp
@echo Building file: $<
@echo Invoking: ARM GCC CPP Compiler
$(MKDIR) $(dir $@)
$(CC) $(CFLAGS) $(CPPFLAGS) -c -o $@ $<
@echo
# Other Targets
clean:
$(RM) $(ALLOBJ) $(ALLDEPS) $(TARGET).elf $(TARGET).bin $(TARGET).hex $(TARGET).map
@echo
@echo Clean core-common-lib
# Should clean invoke clean on the dependent libs as well? Sure..
@$(MAKE) -C $(SRC_PATH)/$(LIB_CORE_COMMON_PATH)/build clean --no-print-directory
@echo
@echo Clean core-communication-lib
@$(MAKE) -C $(SRC_PATH)/$(LIB_CORE_COMMUNICATION_PATH)/build clean --no-print-directory
@echo
.PHONY: all clean check_external_deps elf bin hex size program-dfu program-cloud
.SECONDARY:
#Include auto generated dependency files
-include $(ALLDEPS)
I think @phec meant the build.mk file in the core-firmware src folder, not the actual makefile…
I still don't know what happened to the current clamps I ordered months and months ago to build this project! They never turned up… I'll have to chase them down, because this is still one of the most awesome projects!