Yeah, that's me in October of 1991, wearing an IR-based head-mounted eye tracker. Today, near-IR is used in Driver Monitoring Systems (DMS).
A few weeks ago, Keiran Shelden @Keiran_shelden of the Tech Breakfast Podcast introduced me to Stephen Foskett from Gestalt IT. Stephen has been running Tech Field Day events for many years, yet I was a bit surprised that the name of this particular upcoming event was AI Field Day 3. What do I know about ML (Machine Learning) and AI (Artificial Intelligence)? Well, I do know a little, at least in the context of how autonomous driving works. It's an industry I've been following for many years, a possibility that I've been pondering since my days in (unfinished) grad school. Back in 1993, I had the opportunity to use IR-based eye tracking equipment to see what part of a computer monitor a subject's eyes were looking at, and I clearly remember imagining how cool it would be to see where a driver's eyes look while driving. Why can't cars watch the roadways with us, helping us navigate them more safely and autonomously too?
The idea for this AI Field Day was for me to participate as a delegate (panelist), enjoying discussions with a variety of companies who sponsor the event, after they present their AI-related technologies to us. Participation sounded fun to me, so I requested those 2 days off, then let Stephen know I was in, all-in.
Soon after day 1 of the live-streamed event, I got to experience the off-camera part. Right after each vendor pitch, the live stream is turned off. Delegates then take turns giving direct feedback to the presenter(s), critiquing their work and their content, helping them further master their craft. At typical IT events, such blunt but constructive feedback is rather rare. This is the truly unique part of the experience that I could easily imagine gaining tremendous value from, as an occasional, rather rough-around-the-edges public speaker myself.
The even more appealing aspect of this AI Field Day for me personally was the chance to enjoy open discussions with other delegates about AI-related topics. Gladly, videos of these discussions were published today too. While everybody said stuff far smarter than what I came up with, at least I tried.
The diverse delegates were largely from an IT and/or IT Storage background, with organizers flying eight delegates in to a temporary studio in a Santa Clara, CA hotel room, and five delegates participating virtually over Zoom from around the globe. Their human-aimed camera setup worked well, making it easy for each of us to see whoever was speaking. You can read all about the sponsors and the delegates here.
That's me on the big screen, wow!
It was an extra bonus for me that I already knew Karen Lopez @datachick from John Mark Troyer's really fun Tech Reckoning conference back in 2015, and I also sort of know Joep Piscaer from various articles I wrote featuring his work, including his creative use of GPUs and USB devices connected to VMware ESXi VMs. It was also really fun to hear Shimon Ben-David present for WEKA, since he and I both worked on IBM XIV Storage while at IBM. Small world!
I'm so glad I got to enjoy learning lots of new-to-me concepts and technologies from NetApp, Intel, WEKA, and DDN during these two packed-agenda days. I also had some Twitter fun with the images from the Intel presentation, especially since they dove a bit further into the hardware specifics, which is kind of my thing.
Any fears or concerns I had going into this new experience were unfounded, and I'd highly recommend this experience to any IT Professional who is up for something like this. The entire Gestalt IT crew was an absolute pleasure and very professional, with my special thanks owed to Emily Scafidi, Rachel Fritz, Matt Garvin, Megan Gordan, Andrew Blackburn, and of course to both Stephen Foskett and Keiran Shelden for making this happen. Also, thank you for the swag that arrived at my doorstep yesterday, a rather nice touch.
Despite my attempt to frame the autonomous driving discussion in terms of increasing safety, before long, other delegates were quick to steer the conversation clear of driving autonomy. I get it, perhaps the subject was just too polarizing for their taste, or too difficult to discuss without devolving into something less-than-constructive.
That said, I would still like to point out that the first thing that springs to mind for most folks when the topic of AI comes up is self-driving cars, even though driverless cars don't yet exist, at least not in widespread deployments. Gladly, I have been enjoying an earned front-row seat to semi-autonomous driving beta software in my daily driver EV, my 2018 Model 3. I've found it to be quite wonderful to get to witness its growing maturity since I first tested the #FSDBeta in October of 2021.
Paul Braren and his co-pilot wife, on their first FSDBeta drive together on an empty road, Oct 29, 2021. Extreme caution was taken, with 4 human eyes and 8 computer eyes on the lookout for any reason to intervene at any time. She will not let me or our car put us in any jeopardy, so I took over often. We also avoid irritating any other drivers.
The idea behind this incorrectly named "Full Self Driving" optional feature is to use the massive real-world data set for machine learning, allowing Tesla to develop improved firmware that is then pushed out to the fleet's water-cooled, dual-processor, ~72 watt "Full Self Driving" computer using a proven Over-The-Air update process via WiFi or cellular. This dashboard computer (called FSD Hardware 3, or HW3) processes the 8 camera feeds locally, and acts according to the software code it's running. When manually engaged, it can control acceleration, braking, and steering, but the driver can easily take over completely at any moment. The driver must also stay attentive or the system disengages, and all beta participants are made very aware that the human, not the car, is responsible for the driving. Currently, the objects in the data are labeled by humans, many of them in Buffalo, NY, as explained at Electrek:
The automaker listed some of the responsibilities of a data labeler:
You will use the Autopilot labeling interface to label images critical to training our deep neural networks.
You will interact with the computer vision engineers on the Autopilot team to help us improve on the design of an efficient labeling interface.
You will be expected to gain basic computer vision and machine learning knowledge to better understand how the labels are used by our learning algorithms, as this will allow you to make more judgement calls on difficult edge cases that might come up during labeling.
Eventually, plans are for them to be auto-labeled by the Dojo Supercomputer.
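To make that labeling workflow a bit more concrete, here's a purely illustrative sketch of what a human-labeled camera frame record might look like. Tesla's internal labeling format and tooling aren't public, so every field name here (`camera_id`, `labeled_by`, the bounding-box layout) is a hypothetical assumption, not their actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class BoundingBox:
    # Pixel coordinates of a labeled object within one camera frame
    x_min: int
    y_min: int
    x_max: int
    y_max: int
    label: str  # e.g. "vehicle", "pedestrian", "traffic_light"

@dataclass
class LabeledFrame:
    camera_id: str              # hypothetical: which of the 8 cameras captured this frame
    timestamp_ms: int
    boxes: list = field(default_factory=list)
    labeled_by: str = "human"   # would become "auto" once labeling is automated

# A human labeler marks two objects in a front-camera frame
frame = LabeledFrame(camera_id="front_main", timestamp_ms=1635500000000)
frame.boxes.append(BoundingBox(120, 340, 310, 520, "vehicle"))
frame.boxes.append(BoundingBox(600, 300, 640, 420, "pedestrian"))

print(len(frame.boxes))   # number of labeled objects in this frame
print(frame.labeled_by)
```

The interesting part of the edge-case judgment calls mentioned in the job listing is that they effectively become training signal too: a difficult label resolved by a human is exactly the kind of example an auto-labeler would later need to learn from.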
I hope you'll enjoy viewing some of the videos from #AIFD3 that I've collected for you below, and as always, feel free to leave a comment for the Tech Field Day crew below the videos, and/or comments for me below this article. There are excellent long-form articles about the event's AI discussions below that you'll want to check out too.
In this special Field Day Roundtable discussion, Stephen Foskett asks the AI Field Day 3 delegates whether it is possible to create a truly un-biased AI. Given that machine learning is taking over so many decisions in our lives, how should we train ML to avoid bias? Or is bias inevitable given that it uses human experience as an input? And what is bias really?
Tech Field Day - AI Field Day 3 - Is it Possible to Create a Truly Unbiased AI? An AI Field Day Roundtable

Same video, but cued to the start of my thoughts about participating in #FSDBeta
In this special Field Day Roundtable discussion, Stephen Foskett asks the AI Field Day 3 delegates to imagine the future of AI. What exciting applications are just around the corner that will leverage machine learning and other artificial intelligence technologies to better our lives?
Tech Field Day - AI Field Day 3 - Imagining New Applications for AI: An AI Field Day Roundtable
Same video, but cued to the start of my thoughts, this time about a silly idea for automatic naming of Zoom windows

Same video, but cued to my thoughts at the very end about IoT devices like EcoNet, Sense, and Insteon:

Imagining New Applications for AI: An AI Field Day Roundtable
Paul is a 30th year IT Pro by day, an 11th year technical content creator at TinkerTry.com and podcaster by night, and a 4th year electric vehicle and ADAS (Advanced Driver Assistance Systems) enthusiast who enjoys sharing his Model 3 learnings at many EVents on weekends.
Web site: https://TinkerTry.com
Twitter: @PaulBraren
LinkedIn: https://www.linkedin.com/in/paulbraren/
Paul is an IT Professional who also creates technical content at TinkerTry.com and is active in the VMware vExpert and VMUG Community. He’s acting as the Senior EV Correspondent for Tech Breakfast Podcast, and he’s helping with EV Club of Connecticut advocacy. He’s also a cautious Tesla “Full Self Driving” beta tester since October of 2021, but he’s been pondering various AI, ML, and ADAS (Advanced Driver Assist Systems) approaches to this public safety challenge for over 30 years.
How did you get into Technology and IT?
During graduate school, I found I was spending more time optimizing the performance of the Silicon Graphics computer used for eye-tracking research than I was studying. After a successful stint in helpdesk support that I quite enjoyed, we also had a baby on the way. So I confess, my actual “career” path in IT really began as I traveled extensively to teach resellers about the joys of OS/2. This led to growth into a variety of infrastructure, team lead, and customer-facing roles, and I’m so very thankful for all of it!
What do you do now? Tell us a little about your current role.
I’m an Advisory Solutions Architect at Dell Technologies, working with a variety of Enterprise customers across New England, helping them with their datacenter and cloud solutions. ...
Disclosure: I hold no stock, and have never held stock, in any of the companies mentioned in this article.
Excellent, related videos by Matt Ferrell that talk about how AI can help with climate change solutions, including wind turbine placement, along with cheaper and faster climate change modeling using IBM Quantum Computing.
Undecided with Matt Ferrell - How AI Could Solve Our Renewable Energy Problem
Undecided with Matt Ferrell - Quantum Computers Are Coming … But Why Should You Care?
With DDN A3I AI400X2, AI applications can consume 90 GB/s and 3 million IOPS out of the box. One of the presenters, William Beaudin, Senior Director of Solutions Engineering at DDN, demonstrated real-life examples of their customers' data sets. DDN powers some of the most data-intensive workloads and largest ML/DL data sets in the world. These data sets require intensive performance, as some of the training applications consume 1 TB/s from storage in real time. All of the example data sets DDN showed required speed at both reads and writes. This is an important point that some organizations unfortunately miss when they implement AI, and it's why their projects don't succeed.
Everyone’s a little bit biased
Maybe you’re thinking – I’m a good person. I’m not biased. But here’s the deal: we all have bias. It is human nature. In my liberal arts majors, we were taught to actively expose our own bias so we could also work to neutralize it.
If having bias is part of being human, we must acknowledge it as we research. When it comes to AI, how can you trust any decision if bias has influenced your data or even your theory?
AI bias has the potential to impact many people, so it’s critical to try and identify and neutralize bias at each stage of the process.
— Paul Braren | TinkerTry.com 🖥️🔌☀️🔋🚗 (@paulbraren) May 19, 2022
Joep, sorry we didn't get to chat more, but it sure was good to (virtually) meet you at #AIFD3 last week! I fondly remember our days of tinkering with GPUs and USB under VMware ESXi, very few are so brave! Long before the Dell spinoff & now @Broadcom era. https://t.co/V93mFV2Tyn pic.twitter.com/2of22IcUc3
— Paul Braren | TinkerTry.com 🖥️🔌☀️🔋🚗 (@paulbraren) May 26, 2022
After 6 successful years testing then shipping well over 1,000 Xeon D Bundles, Wiredzone had to stop selling them in mid-2021 due to cost, supply, and logistics challenges. The Xeon D-1700/2700 (Ice Lake D) was a minor refresh, with the Xeon D-1800/2800 refresh slightly better in 2024, and hopefully Xeon 6 (Granite Rapids-D) much better in 2025, featuring PCIe Gen5, MCRDIMMs, and 100GbE networking, wow! I'm bummed that Pat Gelsinger was apparently ousted from Intel's helm in these challenging times, but I'm also grateful to have had the honor of working at VMware when he was the CEO there. I'll leave it at that, given the whole Broadcom thing.