Posts Tagged ‘Avidyne’

I haven’t published in several days. It’s just been crazy. I have a friend who works at Mission Control at the Johnson Space Center (JSC) in Houston. She has been suggesting for several months that I bring the kids down for a behind-the-scenes tour of the facility. With the shuttle program shutting down, over 6,000 people in Houston and 20,000 nationwide will lose their jobs. My friend is slated to be one of the casualties. That meant it was now or never for the tour. Additionally, this past Thursday was the last chance to get on the full-motion shuttle simulator. It’s scheduled to be torn down in a couple of weeks. As a pilot, I just couldn’t pass up a chance to fly the main shuttle simulator.

The problem was weather. Pretty it wasn’t. In the end it meant leaving the house at 3:30 AM last Wednesday to head to the airport. Headwinds were about 25 kts most of the way. We avoided the storms until right at the end. As I turned onto final for the ILS 35L approach at Ellington, I was informed there was rain over the field. NEXRAD showed red over the field and I was about ready to head elsewhere, but I was told it was just heavy rain. When the 500′ callout happened I still couldn’t see the runway. I was thinking this was going to be a missed approach with a diversion. Then my daughter said she could see the runway, and indeed I could too. Winds were gusty but manageable. We made it, but we did get wet unloading the plane. I am glad I had gone up with an instructor and done five practice approaches just a week earlier. It was great having the DFC-100 autopilot in the plane.

We had arrived about 8:30 AM local time. We were tired but decided not to waste the rest of the day, so we filled it with a trip to Space Center Houston. Among other things there was a simple space shuttle simulator, which meant a chance to practice before trying the real one. It was humbling for me, since both of my kids, Chris, 14, and Michelle, 10, pwned me. Here is Chris showing me how it should be done.

On a more general techie note, I was surprised at the use of QR codes. They were all over the Saturn V exhibit. As we were to find out later, not all of NASA is this up-to-date.

That’s a QR code at the bottom of the sign.

Thursday was the simulator. When I get the photos off the camera I will publish another update. Just a “heads up.” It was awesome.



I’m back home and connected. Yeah! My kids are happy since World of Warcraft now works well. I’m trying to catch up and realized I haven’t posted in several days. Next week won’t be any better since I will be heading to Houston for a behind the scenes tour of Mission Control. I hope that trip is as much fun as I expect it will be.

Now to the techie stuff. I was flying today and the conversation turned to how things should work vs. how they really work. Of course the initial topic was flying. I was working through approach procedures using a new autopilot. I fly a Cirrus SR22 equipped with Avidyne R9 avionics. Recently the autopilot was upgraded from the STEC 55X to the Avidyne DFC-100. This is a big upgrade. The STEC understood rate of turn (from a turn coordinator), altitude (from an air pressure sensor), course error (from the Horizontal Situation Indicator), and GPS course. The new autopilot receives input from the GPS, the Flight Management System (FMS), and the Air Data, Attitude and Heading Reference System. In other words, it knows just about everything about the airplane and its condition. It even knows flap position and engine power. The end result is a vastly superior autopilot. Sequencing is automatic (most times – see below). You can put in a flight profile and the plane will fly it, including climbs and descents. The operation is very intuitive and a great example of intelligent user interface design. If you are climbing at a fixed IAS (indicated airspeed) and set up to lock onto a fixed altitude, the IAS button is green to show it is active and the ALT button is blue to show it is armed but not locked. When you get to the desired altitude, the ALT light blinks green and then goes steady green when locked onto the desired altitude. I could go on and on about how great this is; if you have questions, just ask.
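To make the annunciation behavior concrete, here is a minimal sketch of how those button states could be modeled as a tiny state machine. This is purely illustrative and not Avidyne's actual logic; the function name and the capture/hold thresholds are invented for the example. The colors match what I described: blue for armed, blinking green while capturing, steady green when locked.

```python
# Illustrative sketch only - not the DFC-100's real code.
# Models the ALT button annunciation during a climb to a preselected altitude.
from enum import Enum

class ModeState(Enum):
    ARMED = "blue"                 # mode will engage later (preselected ALT)
    CAPTURING = "blinking green"   # transitioning onto the target altitude
    LOCKED = "steady green"        # target captured and held

def alt_annunciation(current_alt, target_alt, capture_band=300, hold_band=20):
    """Return the ALT button state for a climb to a preselected altitude.

    capture_band and hold_band (feet) are made-up thresholds for illustration.
    """
    error = abs(target_alt - current_alt)
    if error > capture_band:
        return ModeState.ARMED      # still climbing on IAS; ALT shows blue
    if error > hold_band:
        return ModeState.CAPTURING  # ALT blinks green while capturing
    return ModeState.LOCKED         # steady green once locked on

# Climbing through 6,000' toward a 9,000' preselect: ALT is still blue.
assert alt_annunciation(6000, 9000) is ModeState.ARMED
```

The point of the sketch is that the pilot never commands the transitions; the state changes fall out of the aircraft's situation, which is what makes the interface feel intuitive.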

Now to more specifics about interface design. When you use the DFC-100 autopilot to fly an instrument landing system (ILS) approach, it is very automatic. If you punch VNAV (vertical navigation), you can have the autopilot fly the entire procedure, including the appropriate altitudes. When the radio signal of the ILS is received and verified correct (all automatic), the system shifts to using the electronic ILS pathway to the runway. So far everything has been very automatic. If you exit the clouds and see the runway, you disconnect the autopilot and land. The problem comes when the clouds are too low to see the runway even when you are close and down low. This is a very dangerous time. At the critical point the plane is 200′ above the ground and there is little margin for error. If you don’t see the runway, you execute the missed approach.

This is where the great user interface breaks down. If you do nothing, the autopilot will fly the plane into the ground. In order to have it fly the missed approach, the following must happen. After the final approach fix, but only after, you must press a button labeled Enable Missed Approach. At the decision height, when you are 200′ above the ground, you must either disconnect the autopilot and start the missed approach procedure manually, or shift from ILS to FMS as the navigation source and press the VNAV button. I can hear people, including pilots, asking me what the big deal is. The big deal is that this is when you really want the automatic systems looking over your shoulder and helping out. If you forget to shift from ILS to FMS, the plane will want to fly into the ground. That’s a very bad thing. The system is still great. Even at this moment it is much better than the old system. I am not saying I would want to go back. I am saying it could be better, and that this operation doesn’t fit with how seamless the autopilot’s operation usually is.

What the system should do is automatically arm the missed approach. I see no reason for this to be a required manual operation with the potential to be forgotten. The pilot should select the decision height at which the missed approach will begin to be executed. When that point is reached, if the autopilot has not been disconnected, the autopilot should start flying the missed approach, including VNAV functionality. That includes shifting the navigation source from ILS to FMS automatically. The result would be increased safety, since the system wouldn’t be requiring command input from the pilot at a critical moment.
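The behavior I'm proposing can be sketched in a few lines. Again, this is a hypothetical illustration, not the DFC-100's actual implementation; the class and field names are invented. The key idea is that reaching decision height with the autopilot still connected is itself the signal to go missed.

```python
# Hedged sketch of the proposed behavior, not real avionics code.
# At decision height, if the pilot has not disconnected, the autopilot
# switches its nav source from ILS to FMS and flies the published miss.
from dataclasses import dataclass

ILS, FMS = "ILS", "FMS"

@dataclass
class Autopilot:
    nav_source: str = ILS
    connected: bool = True
    flying_missed: bool = False

    def update(self, altitude_agl, decision_height):
        """Called every cycle while on final approach."""
        if not self.connected:
            return  # pilot has taken over; land or go around manually
        if altitude_agl <= decision_height and not self.flying_missed:
            # Pilot did not disconnect, so assume no runway in sight:
            # begin the missed approach without further pilot input.
            self.nav_source = FMS      # shift ILS -> FMS automatically
            self.flying_missed = True  # engage VNAV for the published miss

ap = Autopilot()
ap.update(altitude_agl=500, decision_height=200)   # above DH: keep flying ILS
ap.update(altitude_agl=200, decision_height=200)   # at DH: go missed
assert ap.nav_source == FMS and ap.flying_missed
```

Notice that the pilot's only required action in the nominal case is the one that should require action: disconnecting to land when the runway is in sight. Forgetting to do something no longer flies the plane into the ground.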

The discussion above relates to what I have been covering in this blog. As computing systems improve and move into every area of our lives, issues like the one above will pop up. Everything about the DFC-100 is vastly superior to the old STEC. The issue is consistency of use. As our computing systems get better and better user interfaces, minor inconsistencies will appear to us as big annoyances. Look at the iPad. If you think of it as an eBook reader that lets you view mail and surf the web, it is an awesome device. If you look at it as a fun device with simple apps and games, it is awesome. As soon as you want it to be your main computer, things like the lack of a user-accessible directory structure become big problems. Compared to the old Newton or the PDA, the iPad and the iPhone are major advances. However, with this new capability come raised expectations. Developers don’t get to do great things and then sit back. As soon as users get comfortable with the next great thing, they begin to find annoyances. One of Apple’s strengths has been minimizing these annoyances, but even on the best devices they are there. Consistency of user experience is a big deal. Getting there is tough. My point is that small details matter. How the icons look, how smooth the scrolling is, and the animations when actions are taken are all small things that matter. One of the reasons for the success of the iPad and iPhone has been this consistency and sweating the details when it comes to the user interface. As we merge devices and functions in the post-PC world, it will be critical that these disruptions, the non-transparent use scenarios, be identified and fixed.