
More uWho updates – Can now open folders of images, basic logic for opening IP cameras

This is what my program looks like now! The first 3 work well, with the 4th (IP cams) being worked on as I type. I’m going to aim for processing at most 10 streams, to start with. I also need some sort of pretty feedback for the Open Folder option; right now the only feedback comes via qDebug(), which is very much against good interactive-UI practice.

I’ve got my work cut out for me!

Project: https://github.com/jwcrawley/uWho


W00t: Local hackerspace member Steve made an IoT bathroom occupancy light

I’m a member of Bloominglabs here in Bloomington, IN. A good friend of mine, Steve, made an “internet of things” bathroom occupancy light. He’s always doing something awesome with embedded systems, and this is his newest exploit. Not only that, but he brought us a great Instructable to boot!

As for me, I ordered 4 ESP8266s from AliExpress. They’re 80 MHz CPUs with 802.11 b/g/n Wi-Fi and 2 GPIO pins… for $3! They haven’t arrived yet, but I’m already looking toward IoT as well. We’ll see what we can do with cheap, powerful IoT.


Today, members at Bloominglabs called for the “Stuff Room” to be cleaned and purged. This left us members a nifty room better suited to cleaner projects, plus a kicking soldering setup! Unfortunately, cleaning started at 6pm, and I got off work at 7pm. But open house is tonight (and every Wednesday night), so the cleaning and demolishing continued until about half an hour ago.

We also have a great deal of *TEMPORARY* project space next to our laser cutter. However, it does seem that our LCD projector is on the fritz. The connecting cable uses some weird, non-standard DVI look-alike port that terminates in USB and SVGA. No clue here, but we’re attempting a fix.

Intel Perceptual Computing Fail

On 2013/10/22, I attended an informal (albeit packed) conference by IUMakes. The conference featured a talk and demonstration about Intel’s up-and-coming Perceptual Computing devices and software stack.

The demo started with a talk about makerspaces, Arduino, the hacker ethic, the Leap Motion, and similar devices. From there, the speaker introduced Intel’s new piece of hardware, the Creative Senz3D (for now, the Creative Interactive Gesture Camera). Details are somewhat sketchy.

This talk touched on Minority Report, gorilla arms, gestures, hand tracking, and the many problems one would have in using gesture-based systems like this. The speaker also showed off how Intel’s system could do depth maps, webcam color overlays on top of the aforementioned depth map, and microphone input.

Then the downsides: it’s ONLY for Windows. I believe I found out why. The demonstrator mentioned that opening the device is a no-no due to laser-safety concerns. That comment leads me to believe this is yet another laser-dot-field sensor in the IR domain. The Kinect uses a similar technique, and it works well. However, the working range of about 0.5 to 4 ft would also indicate that the dot pitch is much tighter than the Kinect’s. The BIG reason it’s Windows-only, though, is the API. Taken from here: http://software.intel.com/en-us/vcsource/tools/perceptual-computing-sdk

~Speech recognition: Just like Google, Intel will keep their speech corpus close to them. There’s no reason why this library couldn’t be used with any arbitrary microphone.

~Hand and Finger Tracking: There’s still no real reason why this requires anything beyond a webcam. Hand and joint detection is a software problem, solvable via OpenCV. Although having a library abstract it away is rather nice.

~Facial Analysis: This is cool, but this also goes along the lines of ‘OpenCV software problem’. Even my Android phone can be set up to allow a face login… and it also requires me to blink! There’s no structural scanner in that phone: it’s a software problem, solved by software.

~Augmented Reality: OK… this is where the project does get cool. As demoed, it can strip the background out of a video chat and put in whatever you want. Better yet, it can also allow real-time stripping and reconstruction of an environment.

~Background Subtraction: This really ties into the point above. Once we have depth data, we can do cool stuff, but it all hinges on an appropriate depth sensor. And the first-gen Kinect seems much more reasonable for hackery than this device. Depth sensors are only getting better. Don’t break the bank here, guys and gals.

There is absolutely no Mac port planned, and Linux is… “questionable”. Oh, and this device is $150, with only an API.

So, what I see here seems to be the newest trend in courting hackerspaces and makerspaces: release a peripheral that’s 80% done, and get the community to finish the rest of the hard work for free. Even the Leap Motion was $90, shipping included. Admittedly, Leap has SDKs for Mac and Windows, with an alpha SDK for Linux. Even they made a point of releasing what they had.

I asked a question: “Why not put a LiPo battery and Bluetooth 4.0 in it, treat this as a portable device, and use it with Android?” I was hushed, as that was apparently Intel’s next plan for this device. Any enterprising hacker could easily see that; hell, we could do it. And yet, they’re looking at Android. Real Linux support isn’t that far away.

The Intel spokesman talked about their Arduino-clone Pentium board, their lack of any staying power in the maker community, and their dwindling numbers in the mobile space (4%… ouch). And this, yet another Wintel project, shows precisely why.