NerdKits - electronics education for a digital generation

Microcontroller Programming » Serial Read in Labview

November 18, 2011
by Sefco

I have the nerdkit connected to Labview via the Serial-to-USB connector that comes with the nerdkit. I have a VI that uses the VISA serial read to acquire the serial data, which in this case is the temperature output. The system works great except there is a timing issue between the nerdkit and Labview. I can acquire the data, but where the data begins depends on when I start the VI. For example, right now the output is "72.04 Degrees", but when I read it in Labview it may take the form of "ees72.04 Degr". I'm slowly wading into digital electronics, but it seems like there should be a trigger or a start marker to give Labview a heads-up that a reading is on its way.

Certainly this happens all the time and there is an easy, efficient solution. Thanks ahead of time for your help.
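One standard fix, whatever the host software, is to frame the stream on the newline each reading ends with and discard the partial line you start in. A minimal Python sketch of the idea (assuming each reading is newline-terminated; the hard-coded chunks and the `framed_readings` helper name are stand-ins for whatever VISA or pyserial hands you):

```python
def framed_readings(chunks):
    """Reassemble complete newline-terminated readings from arbitrary
    chunks of serial data, discarding the partial line we start in."""
    buf = b""
    started = False  # becomes True once the first newline has gone by
    for chunk in chunks:
        buf += chunk
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            if started:
                yield line.strip().decode("ascii", errors="replace")
            else:
                # the first "line" may be a tail like "ees" -- drop it
                started = True

# Example: the stream joined mid-message, as in the post above
chunks = [b"ees\r\n72.04 Degr", b"ees\r\n73.10 Degrees\r\n"]
print(list(framed_readings(chunks)))  # ['72.04 Degrees', '73.10 Degrees']
```

After the first newline goes by, every line is a complete reading no matter when you started listening; the same buffer-and-split logic can be reproduced inside a Labview while loop.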

November 18, 2011
by Ralphxyz

Well, the serial output of the temperature sensor code does not contain a start marker that Labview recognizes; you would have to change the code.

Seems like a simple thing to do. What does Labview (VI?) use as a start marker?

By the way, what is a VI? Not many of us here at Nerdkits are familiar with Labview.

In fact, if you went over to AVRfreaks and mentioned Labview, the response would be excruciating.

Labview is not viewed very favorably in that AVR community.

Some here on the Nerdkit forum actually use Labview, so you should get an actual answer to your question.

Ralph

November 18, 2011
by 6ofhalfdozen

Sefco,

I don't work much with Labview (a tiny bit on an old Mac2e with Labview 1.0), but I have worked a decent bit with sensors sending data to computers monitoring serial data streams. The issue you are seeing is fairly common, and there are several ways to deal with it, but I don't know the exact methods/code for modern Labview. So here are a couple of general methods to fix the issue that work for Labview or any other software, assuming you can program the code to do it.

  1. Use software filtering on the existing stream to pull the data out, i.e. look for "Degrees" and treat the adjacent characters as the data.

  2. Retweak the NK code to include a data flag in the stream, i.e. make the stream read "77777 Degrees XX.XF", and use 77777 to tell the computer to pull the next characters as data.

  3. Bring back the RTS/CTS lines, if your serial-to-USB converter can handle it. This was the original way to pass "data packet coming" messages back and forth. A Google search for RS-232 communication should pull up some explanations of how this works. It is relatively easy to set up the nerdkit to set a pin high when it is ready to send, but you need to know if your serial/USB converter can handle it, and then Labview should be able to send back; though honestly I don't know how the newer versions work and whether they still support this.
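Method 1 above can be sketched in a few lines of Python (the "Degrees" keyword comes from the output Sefco described; `extract_temps` is a hypothetical helper name):

```python
import re

def extract_temps(text):
    """Pull the numeric value that precedes each "Degrees" keyword
    (method 1 above: filter the raw stream in software, so it does
    not matter where in the message the capture started)."""
    return [float(m) for m in re.findall(r"([-+]?\d+\.?\d*)\s*Degrees", text)]

# Even a stream captured mid-message yields the complete readings:
raw = "ees72.04 Degrees73.10 Degrees74.2"
print(extract_temps(raw))  # [72.04, 73.1]
```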

Hopefully that helps a little.

November 26, 2011
by pfullen

Why do some AVR people not like Labview? Just curious and want to learn.

I do not know much about it except that the Lego Mindstorm uses it. I have an engineering friend who uses it all the time for work.

November 27, 2011
by 6ofhalfdozen

pfullen,

I don't have anything against Labview, well, except the high cost. I have heard rumblings from several people with non-AVR/scientific backgrounds who feel that Labview is a "shortcut" that lets too many people do things without caring about the how's and why's behind them. Honestly, I haven't heard too much from AVR folks against Labview, except that it's expensive and they don't have/know it. But then again, I haven't specifically looked at AVR forums for Labview stuff.

And my two cents is that Labview could be a nice software side to match with an AVR, but since I have VB, can get the new VB Express for FREE, and know VB, that is what I use.

December 01, 2011
by Ralphxyz

I "think" (not having ever seen or used Labview) that it is just too much overhead for programming a 16 KB MCU.

You are not (generally) going to have a hundred external components; you are going to have an MCU and a transistor, or maybe a port expander or shift register, or do everything with the MCU alone.

All you need is a text editor.

Ralph

December 01, 2011
by Sefco

Hey guys, thank you for the quick responses. I'm always impressed with the depth of knowledge and the willingness to help out in the nerdkits community. Unfortunately I've been busy with Thanksgiving and such, so I have not had a chance to implement any of your suggestions.

As far as labview goes, I'm more or less using the nerdkit as a poor man's data acquisition board. With the winter months upon us, my current project consists of using the nerdkit to convert the analog temperature voltage to a digital value and sending that to the computer where Labview neatly plots temperature vs. time and saves the values to a file.

To be honest I'm a practicing mechanical engineer and I'd like to stay sharp with Labview in order to fortify my resume. I'm running a copy of Labview Student Edition Express 7.0 that came with one of my textbooks. The nerdkit has allowed me to become much more electrically savvy and I've also learned C along the way. I used to DESPISE the EE portions of my coursework, and now I can't get enough.

I'd like to delve into a discussion on sampling rate. So the crystal is timed at 14,745,600 Hz, but we generally pre-scale by 1/128 so the ADC clock runs at 115.2 kHz, which is also our serial baud rate. I'm under the impression that our effective sampling rate for an 8-bit temperature measurement is then 115.2 kHz * 1 byte / 8 bits = 14,400 samples/sec. Then there is the piece of code that averages every 100 samples before display, so the serial output rate would be 144 readings/sec to Labview. I guess what I'm uncertain about is the ability of the MCU to multitask, as in continuously measure temperature at 14.4 k samples/sec, average every 100 samples, write to the LCD, and write the serial out.

Is 14.4 k samples/sec an accurate sample rate, or is it something less because of all the other operations that are occurring?
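One hedge on the arithmetic: the ADC clock is a conversion clock rather than a bit clock, so dividing 115.2 kHz by 8 bits doesn't quite apply. The ATmega168 datasheet specifies 13 ADC clock cycles per normal conversion, which puts the ceiling well below 14,400 samples/sec even before LCD and serial work. A back-of-the-envelope check:

```python
# Sample-rate ceiling for the nerdkit ADC (assumes the ATmega168
# datasheet figure of 13 ADC clock cycles per normal conversion).
f_crystal = 14_745_600          # Hz, the nerdkit crystal
adc_clock = f_crystal / 128     # 1/128 prescaler -> 115200 Hz
cycles_per_conversion = 13      # normal conversion, per the datasheet
max_sample_rate = adc_clock / cycles_per_conversion

print(adc_clock)                      # 115200.0
print(round(max_sample_rate))         # 8862 conversions/sec, not 14400
# Averaging every 100 samples before printing divides this again:
print(round(max_sample_rate / 100))   # 89 readings/sec upper bound
```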

December 02, 2011
by Ralphxyz

ah ha!!

"With the winter months upon us, my current project consists of using the nerdkit to convert the analog temperature voltage to a digital value and sending that to the computer where Labview neatly plots temperature vs. time and saves the values to a file."

That is what "we" use Python for.

Ralph
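For the curious, the Python route Ralph mentions can be quite small. A hedged sketch (the `log_readings` helper name is hypothetical, and the canned readings stand in for lines parsed off the serial port):

```python
import csv
import io
import time

def log_readings(readings, out):
    """Append (timestamp, temperature) rows to a CSV file object --
    the Python equivalent of the plot-and-save VI described above.
    `readings` stands in for values parsed off the serial port."""
    writer = csv.writer(out)
    for temp in readings:
        writer.writerow([time.time(), temp])

# Demo against an in-memory buffer instead of a real file:
buf = io.StringIO()
log_readings([72.04, 72.11, 72.08], buf)
print(buf.getvalue().count("\n"))  # 3 rows written
```

Plotting the saved CSV is then a separate step, e.g. with matplotlib, as in the NerdKits graphing tutorials.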

December 04, 2011
by pfullen

Ralphxyz

You mentioned Python. I was looking to expand my computer programming skills and was looking at the languages that Google Apps supports.

I noticed that Python and Java were the two main programming languages used.

In your opinion, would it be better to learn Python or Java? What are the pros vs. cons?

Btw, thanks for all your postings on this forum. It is great to be able to post questions and have people like you with tons of knowledge out there.

December 05, 2011
by Ralphxyz

Thanks pfullen, you'd be amazed at my lack of knowledge but I try :-)

If you look at the Nerdkit Tutorials you will see examples of using Python to graph data. The Nerdkit Strain Gage is my first Python exercise for graphing data.

I have tried repeatedly to learn Java; every tutorial I have tried from the web eventually breaks down and does not work, with no follow-up support from the site host.

I have also bought a number of "learning" Java books, which also eventually break down and the samples do not work, again with very limited support.

With Python there is always support. I am sure there are also various Java forums where a noob is tolerated, but the Python forums expect you to not know anything.

Humberto has suggested using RUR-PLE to learn Python; I have it on my list to actually go through it.

Ralph

December 07, 2011
by GeeBob

@Sefco Did you get your VISA serial comms worked out? I have some experience with LabVIEW and I might be able to help you sort this out. I really like the idea that you are going to use the Nerdkit as a poor man's DAQ device.

@pfullen I don't know why people have such a bad opinion of LabVIEW. I can understand why the AVR folks don't like it, as it is expensive and is limited to a specific set of embedded targets. But its strengths really come into play when you have to control and collect data from instrumentation.

@6ofhalfdozen I agree with you about using VB. You should use what you are comfortable with. As far as "shortcut" goes: isn't C a shortcut for assembly? People can write bad programs in any language. LabVIEW, C, C++, Python, and VB are all just tools in a toolkit. And we all love more tools, right ;)

@Ralphxyz I really enjoy reading your posts, whether they are questions or answers. I agree that LabVIEW is too heavy for the 16 KB MCU.

December 21, 2011
by frsp5

Hi guys, I just got here and am enjoying reading some of the forums. I see no one ever answered Ralph's question: "VI" stands for Virtual Instrument. Example: you plug a National Instruments (NI) A/D and D/A board into your computer, use it to measure voltages, and you can easily "write" a program to emulate an oscilloscope display on your screen, hence "virtual instrument." I haven't programmed in Labview in over a decade, and haven't programmed in C in over two decades. But having a son inspired me to show him a few things.
