-
Working Waterfall Display
03/22/2015 at 19:38
Had some time this weekend to tweak the vb.net display. Data points are displayed as bitmaps on a working waterfall display. Further code cleanup is required, but it's pretty efficient as is: no more crashes after 250 data points are collected, and no more flickering now that the bitmaps are handled by the Form.Paint event.
On other fronts, I'm considering these vibration sensors as my active elements: https://www.sparkfun.com/products/9196
I don't know how well they would work for this application, though. I will have to order one and try it.
-
There should be a private section on here for notes... sketchbook.io anyone?
03/21/2015 at 17:57
Not really a log, but a place for me to dump some research material.
Below material from: http://www.radartutorial.eu/
Calculating the phase increment
The phase shift Δφ between two successive elements is constant and is called the phase increment. How large does this phase shift have to be to achieve a given amount of beam steering?
Consider a linear arrangement of isotropic radiating elements. The path difference between neighbouring elements is x = d · sin Θs.
- A radar set works with a wavelength of λ = 10 cm.
- The distance between the radiating elements is d = 15 cm.
- We can neglect the propagation time differences in the feed network.
- The beam steering shall be Θs = 40°.
- What value does phase shifter no. 8 (on the left side) need in order to produce this beam steering?
Because of the trigonometric function we need a calculator anyway: Δφ = (360° · 15 cm / 10 cm) · sin(40°) = 347.1°.
This means radiating element no. 8 needs the phase shift value φ8 = 7 · 347.1° = 2429.7°.
Because of the periodicity of the sine function, a phase shift of n·360° is the same as 0°, so we can keep subtracting 360° until the result lies between 0° and 360°. Phase shifter number 8 (left corner) therefore gets a phase shift value of φ8 = 269.7°. Part of this phase shift is already realized by the delay in the feed line.
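A quick way to sanity-check these numbers is to drop the formula into a few lines of code. This is just the tutorial's example restated in C++ for my own reference; the 15 cm spacing, 10 cm wavelength and 40° steering angle are the tutorial's values, not anything from my build yet.

    // Phase increment for a linear array: dphi = 360 * (d / lambda) * sin(theta).
    // Element n (counting from 0) needs n * dphi, reduced modulo 360 degrees.
    // The values below are the radartutorial.eu example, not measurements.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double PI     = 3.14159265358979323846;
        const double d      = 15.0;                     // element spacing, cm
        const double lambda = 10.0;                     // wavelength, cm
        const double theta  = 40.0 * PI / 180.0;        // steering angle, radians
        const double dphi   = 360.0 * (d / lambda) * std::sin(theta);  // ~347.1 deg

        for (int n = 0; n < 8; ++n) {                   // elements 1..8 -> n = 0..7
            double phase = std::fmod(n * dphi, 360.0);  // wrap into 0..360 deg
            std::printf("element %d: phase %.1f deg\n", n + 1, phase);
        }
        return 0;
    }

Element 8 comes out at 269.7°, matching the worked example above.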
-
Higher Resolution Proof of Concept
03/18/2015 at 01:04
The first set of sensors arrived yesterday from Amazon. Twenty minutes of soldering later, they are hooked up to the Arduino. I had time over the weekend to write the code to drive them all and return the values in a defined format.
sample: [04]:[67]:[45]:[90]\r\n
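The actual sketch is not posted here, but the gist of it is along these lines. This is a minimal reconstruction that assumes four HC-SR04-style sensors, made-up pin assignments, and distances reported in centimetres; treat it as a sketch rather than the code as written.

    // Minimal reconstruction, not the real sketch: poll four HC-SR04 sensors in
    // turn and print one line per pass in the [dd]:[dd]:[dd]:[dd] format.
    // Pin numbers and the cm units are assumptions for illustration.
    const uint8_t TRIG[4] = {2, 4, 6, 8};
    const uint8_t ECHO[4] = {3, 5, 7, 9};

    void setup() {
      Serial.begin(115200);
      for (uint8_t i = 0; i < 4; i++) {
        pinMode(TRIG[i], OUTPUT);
        pinMode(ECHO[i], INPUT);
      }
    }

    long readCm(uint8_t trigPin, uint8_t echoPin) {
      digitalWrite(trigPin, LOW);  delayMicroseconds(2);
      digitalWrite(trigPin, HIGH); delayMicroseconds(10);  // 10 us trigger pulse
      digitalWrite(trigPin, LOW);
      long us = pulseIn(echoPin, HIGH, 30000UL);           // time out past ~5 m
      return us / 58;                                      // round-trip time -> cm
    }

    void loop() {
      for (uint8_t i = 0; i < 4; i++) {
        long cm = readCm(TRIG[i], ECHO[i]);
        Serial.print('[');
        if (cm < 10) Serial.print('0');                    // keep two-digit fields
        Serial.print(cm);
        Serial.print(']');
        if (i < 3) Serial.print(':');
      }
      Serial.print("\r\n");
      delay(50);                                           // pace the sample rate
    }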
I also rewrote the vb.net application to handle the data line by line and optimized the code further. I am now working with bitmaps instead of the original hacked-together approach of drawing a panel and then filling it with smaller panels. This should make it easy to mark things on the waterfall and easier to save the data to a file. Bandwidth is also a concern when getting fast sample rates rendered on the screen; using the graphics library will be faster than the original panel solution and has the benefit of being able to stream data into the view.
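The display itself is VB.NET, but the underlying idea is simple enough to sketch in a language-neutral way (C++ here, purely for illustration): keep one pixel buffer, turn each parsed sample line into one row of coloured pixels at a rotating write index, and let the paint handler blit the whole buffer. The sizes, the assumed 0 to 100 cm range, and the grey-scale colour map below are illustrative, not the real code.

    // Sketch of the waterfall buffer idea only, not the actual VB.NET implementation.
    #include <cstdint>
    #include <vector>

    struct Waterfall {
        int w, h, row = 0;
        std::vector<uint32_t> pixels;                    // h rows of w ARGB pixels

        Waterfall(int width, int height)
            : w(width), h(height),
              pixels(static_cast<size_t>(width) * height, 0xFF000000u) {}

        // One parsed sample line (e.g. the four [dd] fields) becomes one row.
        void addRow(const std::vector<int>& samples) {
            for (int x = 0; x < w; ++x) {
                // Stretch the handful of samples across the full row width.
                int v = samples[x * samples.size() / w];
                if (v < 0)   v = 0;                      // clamp to the assumed 0..100 cm range
                if (v > 100) v = 100;
                uint8_t g = static_cast<uint8_t>(255 - v * 255 / 100);  // nearer = brighter
                pixels[static_cast<size_t>(row) * w + x] = 0xFF000000u | (g << 16) | (g << 8) | g;
            }
            row = (row + 1) % h;                         // the oldest row is overwritten next
        }
        // The form's paint event just blits `pixels` as a w x h bitmap.
    };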
To reduce testing errors, the sensors are mounted linearly on an aluminum square tube and wire lengths are kept to a minimum. Each unit (PCB) is spaced 5 cm apart for a very uniform look, and each is mounted parallel to the tube, which results in a more uniform test. Each sensor is isolated from the aluminum with electrical tape, which should help keep any additional resonance from the nearby sensors from being transmitted through the backing.
The test unit looks a little unrefined, but the test data looks good. With an effective 4-pixel sweep over about 30 cm, and no additional attempts at compiling multiple sweeps, higher sampling, or beam shaping, I can discern my desk, coffee cup, and monitor base at about 60 cm distance.
I'm checking with the local college lab to see if they have a laser vibrometer for the more precise testing that will be required once phased beam shaping is implemented.
Additional SR-04 sensors from eBay are still on the way. I will need to build some hardware logic to address the 14-sensor array from the Uno; one possible approach is sketched below. I'm concerned the Uno may be too slow, or rather that the combination of Uno and SR-04 sensors will be too slow to be practical; however, as an extended proof of concept I think it will work well enough.
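I haven't settled on the addressing hardware yet. One option, purely a sketch with a hypothetical part choice and made-up pin assignments, would be a pair of CD74HC4067 16-channel multiplexers sharing their select lines: one fans a single trigger pin out to the selected sensor, the other routes that sensor's echo back to a single input.

    // One possible way to address 14 sensors from the Uno, not a settled design:
    // two CD74HC4067 muxes share the same four select lines; mux A fans the
    // trigger pin out to the selected sensor, mux B feeds its echo back in.
    const uint8_t SEL[4]      = {4, 5, 6, 7};  // select lines to both muxes (assumed pins)
    const uint8_t TRIG_PIN    = 2;             // to the common pin of the trigger mux
    const uint8_t ECHO_PIN    = 3;             // from the common pin of the echo mux
    const uint8_t NUM_SENSORS = 14;

    void setup() {
      Serial.begin(115200);
      pinMode(TRIG_PIN, OUTPUT);
      pinMode(ECHO_PIN, INPUT);
      for (uint8_t b = 0; b < 4; b++) pinMode(SEL[b], OUTPUT);
    }

    void selectSensor(uint8_t ch) {
      for (uint8_t b = 0; b < 4; b++) digitalWrite(SEL[b], (ch >> b) & 1);
    }

    long pingCm(uint8_t ch) {
      selectSensor(ch);
      delayMicroseconds(5);                              // let the mux settle
      digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
      digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
      digitalWrite(TRIG_PIN, LOW);
      return pulseIn(ECHO_PIN, HIGH, 30000UL) / 58;      // echo time -> cm
    }

    void loop() {
      for (uint8_t ch = 0; ch < NUM_SENSORS; ch++) {
        long cm = pingCm(ch);
        Serial.print('[');
        if (cm < 10) Serial.print('0');
        Serial.print(cm);
        Serial.print(']');
        if (ch < NUM_SENSORS - 1) Serial.print(':');
      }
      Serial.print("\r\n");
    }

Whether this is fast enough is exactly the open question: each ping can take tens of milliseconds at longer ranges, so polling 14 sensors one at a time would cap the line rate at only a few sweeps per second in the worst case.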
These are the transducers I am considering building into the active array: http://www.ebay.ca/itm/1pc-Ultrasonic-Sensors-Integrated-Transceiver-Waterproof-Diameter-16MM-/191136732745?pt=LH_DefaultDomain_0&hash=item2c80a31649
I like their size and availability and that they are pre-waterproofed; I'm less thrilled with the price. They will be embedded in a non-resonating material for ease of mounting and consistency. I am currently considering silicone, since the transducers could be removed from it easily and it should absorb additional vibration from the transducers, reducing co-location interference and letting us scan at a faster rate.
Thanks for reading!
-
The Dream
03/14/2015 at 06:00
"It came to me in a dream and I forgot how it worked in another..."
How hard can it be to visualize sound? Really? No? No one on the internet builds this? That's stupid! Screw it, I'll build one. How hard can it be?
And so it begins. I'm not starting this project with nothing to show already; I think that's foolish. But I do know that I know almost nothing about audio engineering or electronic design, and this is going to be a learning experience for me and my wallet... probably.
I started off by building a vb.net waterfall-type project that simply colours squares (pixels) for each sample it receives via COM port from the Arduino Uno and its hilariously badly wired SR-04 sensor I pulled off my quadcopter. (Vibrations and ultrasonics apparently don't play nice together.)
Some tweaking of the colours and the double-translation math later, I have a working visualizer for the data streaming from the Arduino. Proof of concept complete. Let's build a small-scale prototype to better flesh out the concept and get some better/more usable imagery on the waterfall. 10 SR-04s ordered from eBay. (Edit: Damn, 4 weeks... ordered 3 more from Amazon with Prime; they get here Monday.)
Now how do we narrow the sample beams down to get better resolution? I mean, I can see when I sweep it past my monitor that the monitor surface is above the wall surface, but I know what I'm looking for. It doesn't do much good if you didn't know what to look for. Wikipedia says that the US Navy's top-of-the-line (in the 1960s) swath sonar used 61 beams at 1 degree precision. How did they get 1 degree precision on sonar? That question led me to the joys of phased sound arrays and directional sound, with some crossover into RF theory. Great, more stuff that is black magic to me... Also of note: a 6-degree-beam ultrasonic sensor is nearly $130.00 USD.
Now I'm left wondering until next week how I'm going to trigger ultrasonic pulses close enough together to form the precise wave front I need to get anywhere near 1 degree of precision. Modern arrays have a beam resolution of 0.016 degrees!
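Back-of-the-envelope, the delay between neighbouring elements for a steered wavefront is Δt = d · sin(θ) / c. The numbers below assume a hypothetical 2 cm element spacing in air; nothing here is measured from my build.

    // Rough numbers only: per-element trigger delay for a steered wavefront is
    // dt = d * sin(theta) / c. The 2 cm spacing is an assumed value; the speed
    // of sound is ~343 m/s in air and ~1480 m/s in water.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double PI = 3.14159265358979323846;
        const double d  = 0.02;                   // element spacing in metres (assumed)
        const double c  = 343.0;                  // speed of sound in air, m/s

        const double angles_deg[] = {1.0, 10.0, 40.0};
        for (double deg : angles_deg) {
            double dt = d * std::sin(deg * PI / 180.0) / c;
            std::printf("steer %4.0f deg -> %6.2f us between elements\n", deg, dt * 1e6);
        }
        return 0;
    }

A 1-degree steer works out to roughly a microsecond of delay between elements at that spacing, which is exactly the kind of timing resolution that will be hard to hit cleanly in software on the Uno.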
I assume building something in hardware is the best way to tackle this. Expensive systems seem to use a flat plate that resembles a solar panel; I assume it's an array of piezo drivers in a low/non-resonating substrate. I don't hope to achieve their results; I'll settle for far less. I figure an array of 60 scanning transducers, with an extra one for wave forming/limiting on each end: a linear line of 61 transducers on both sides (total count is 184 individual transducers, fired in sets of 5). That probably means I'll be ordering transducers to build up a large grid and figuring out either how to use a few transducers to fire a ping while the bulk of them listen, or how to multiplex all of them. Probably some fancy function generators that I don't know how to build yet, plus amplifiers and gain control. (My giant book from element14 has those covered, though.)
I'll test out multiple pings with a JSON-style return on serial and see if the images I get from the waterfall are more recognizable. I know in air they aren't going to be great, but after I test that, I'll waterproof the boards and test in a liquid medium to see what I end up with. Post more next week. Thanks for reading!