Altitude

For the record, cell phone reception works relatively well when flying 1,100 feet above a major metropolitan area in an aeronautically snail-paced PT-17 Stearman. Soaring along as a front-seat passenger in the bright yellow biplane, originally used as a trainer for aspiring WWII pilots, I spotted our subdivision below. I texted my wife to let her and our kids know that the pilot and I were directly above our tree-obscured neighborhood. “We saw you!!!” came the thrilled reply. Months before, she had purchased the 30-minute, open-cockpit ride as a birthday gift for me from the flight museum at the adjacent airfield, practically nestled in our backyard. On this day, incidentally the 20th anniversary of 9/11, I made time for the change in vertical perspective and presented my ticket. It was brief as flights go, but it was an experience I won’t soon forget.

Our neighborhood, bird’s-eye view.

If you’ve ever coasted far above the earth on a pair of wings and an engine or two, looking down through the window of the pressure-sealed passenger cabin, you can’t help but notice how different your local haunts appear. For one, at this altitude you can see many of them at once, and the distances between them seem far shorter; my sphere of day-to-day influence on the ground isn’t quite so expansive after all. Moreover, while my family pinpointed me in the plane, I never could find them on the ground. They weren’t even on the scale of the ants I often spot on the sidewalk in front of our house. The four of them might as well have been invisible, like any other human below.

Maybe it was the effects of the altitude, but it put me into a reflective mood. Down below, where gravity binds us to the earth, it’s easy to feel important when you’re blind to the big picture. Far above, however, every self-important person on the ground disappears. As we glided along for the half-hour ride, I considered this, the nature of ambition, and the sometimes misguided pursuit by certain of us to rise above it all.

If you keep up with the news even remotely, regardless of your source of choice, you’re likely to have come across the name “Elizabeth Holmes.” Presently, she is on trial facing multiple criminal charges of fraud involving Theranos, a company she started in 2003 at the tender age of 19. Once touted as the next Steve Jobs, Holmes took this to heart and deliberately fashioned her likeness in the image of the storied tech titan, even going so far as to sport the black turtlenecks for which he was famous. Her management style, it is said, bred more fear and anxiety in the workplace than a spirit of teamwork or cooperation. To say she was task- rather than people-oriented would be an understatement.

Some of these tendencies might have been excusable had there been a world-changing product to unveil. Holmes’s single-minded pursuit was to develop a comprehensive blood-testing device requiring only a few drops of blood rather than multiple vials. For years, her staff attempted to achieve what more knowledgeable observers knew to be physically impossible, but she plowed on nonetheless, opting instead to fake it till she made it. While the scientists and engineers behind the scenes at her company experienced little more than failure after failure, Holmes ordered the results buried or whitewashed, choosing smoke and mirrors with the public and investors and brazenly lying about the success of her “Edison” device, as it was named. When the pressure mounted and the Edison continued to fall short of expectations, samples taken from patients and volunteers were secretly diluted to the required volume and run through traditional third-party machines, repeatedly returning inaccurate results. When the Edison did work, it performed nowhere near the comprehensive range of tests its founder claimed. In some cases, actual patients made critical health decisions based on the information provided by either method, only to discover later that they had been duped. Holmes, blindly ambitious to a fault, would rather have let others believe she had a revolutionary product than patiently test it before releasing it, or admit defeat and move on. Even after whistleblowers blew her cover, her company collapsed, and she found herself a defendant in the dock, she stubbornly refused to admit to any wrongdoing.

Then there’s the story of Stephen Glass, one of my favorites. In the mid- to late-90s, the young writer earned a spot in the offices of “The New Republic,” which, at the time, boasted of being the official in-flight magazine of Air Force One. Glass’s gifts as a journalist were evident almost from the start; it seemed he had a special knack not only for narrative but also for finding unique sources for his fantastic pieces that strained credulity but were nonetheless entertaining to read. The truth, however, was far more interesting.

Reading Glass’s article “Hack Heaven,” journalists at Forbes magazine couldn’t help but wonder why they hadn’t heard of Ian Restil, Jukt Micronics, or the “Uniform Computer Securities Act.” Upon closer inspection, and after grilling Glass and TNR’s editor, it became abundantly clear to the writers that the article was about as rock solid as a marshmallow. The facts were hollow, and Glass was forced into a corner. Rather than confess his sins, Glass invented further fiction to support the fiction. Long story short, Glass lost his job, the majority of his published work in the magazine was found to be completely or partially fabricated, and TNR was compelled to apologize to its readers and struggled to regain its integrity and reputation.

What’s of greater interest and relevance to me about either of these tales, however, reaches further back, long before the very public, colossal fall and even before the early expectations of great things to come in their youth. It’s a very young Holmes, born into a once well-endowed family, whose parents felt, and applied, the pressure for her to make a name for herself. It’s Glass deciding to study law while penning his articles for “The New Republic,” because it wasn’t enough for his parents that he was employed as a writer by a highly successful, nationally recognized publication. In short, it’s the expectations we foist upon our kids and the manner in which they choose to fulfill our hopes for them.

Our oldest did not inherit our lack of athletic prowess, fortunately for her. Rather, she came to us at almost 9 years of age with the skills of a natural at any physical activity. Learning to ride a bike took all the painstaking effort of less than half an hour. We first encouraged and nurtured her abilities by enrolling her in gymnastics, only to be told a year or two later that the instructors had nothing left to teach her. What she did learn and had the opportunity to practice would serve her well in middle school, where, by the time she finished, she had competed successfully in multiple sports and made the cheer team her final year.

Transitioning into high school this year, she wisely chose to limit herself to one or the other and opted to try out for cheer. Much to her pleasure and not to our surprise, she not only made the team but was asked to join varsity, and this as a freshman. She dutifully cheers at weekly games and is one of their few featured tumblers. If this weren’t enough, she made the team’s elite competitive performance group, again, as I mentioned, as a freshman. I’m still amazed at how she can tumble and twist end over end given only a long, open patch of ground.

Before opting out of gymnastics, we discussed what it would mean for us to be the kind of parents who raised the stakes and our expectations to transition her from casual hobbyist to serious contender, as was encouraged. We learned this would involve greater financial resources, daily hours-long practices after the school day, regular weekend competitions, etc. The sacrifices made would also redirect time away from her younger brother and sister as we focused our attention on her God-given talent.

In the end, we did not choose that route, and she was happy we didn’t. We can see she is better for it. The instability and stress of life before her transition into our home was, in part, enough to persuade us that her life could still be great without the overwhelming pressure to be great, so to speak, at just one thing. Had we charged ahead, however, regardless of wins and accolades, I have wondered what impression our pushing would have left on her.

We all need to encourage our kids to be at their best, just as we should be at ours. I could never dispute that. I’m reminded of the quote from the actor portraying British runner Eric Liddell in the famed movie “Chariots of Fire”: “I believe God made me for a purpose, but he also made me fast. And when I run, I feel his pleasure.” These words speak truth to the direction in which our aspirations should be pointed: towards the One who gave us our gifts. As I reflect on the impression our choices have on our children, I’m convinced it’s the only way our and their ambitions can remain pure and admirable.

Stories such as those of Glass and Holmes are instructive about our ambitions. When we strive merely to be better than others, all bets are off; ethics and fair play fall low on the list of priorities. There will always eventually be someone better than us at whatever we do, even if it takes a little time to discover it, and especially if we are dishonest in our pursuit. When we strive instead simply to be better with an audience of One, we’re truly free to be at our best. I pray our approach with our kids reflects this in their efforts.

I can’t help but hope that folks such as Glass and Holmes might consider this as well next time they find themselves peering down at the ground below.

Screen Time

I don’t recall the precise reason, but it became eminently clear it was no longer a living room accessory. It had silently departed the house in our absence, and we never had a chance to say goodbye. As our parents drove my brother, sister, and me home from our grandparents’, where the three of us had spent the weekend, one of us must have expressed, somewhere along the hour-long trip, a screen-starved eagerness to play our game system — an Atari 2600 — as soon as we set foot in the door. Upon hearing this, Mom pounced from the front seat with a ready reply. Our anticipation was summarily extinguished by her unwelcome news.

“It broke.” “It messed up the TV.” “The dog chewed it up.” Whatever the reason, all we heard was, “It’s gone.” In the history of parental excuses, whichever one Mom chose was likely as common as they come. The subtext, however, was, “You were spending too much time on it.” So, with the cold-heartedness of a hitman, she offed it. Even Dad, I expect, quietly mourned while presenting a united front with our mother. No more heavily-pixelated “River Raid” bombing runs after we were in bed. Cue single tear.

It would be the only game system our parents themselves would purchase for the family while we were growing up in our single-screen home, and it went as quickly as it came. Even in the ’80s, long before the advent of iPads, smartphones, and the conscientious parenting term “screen time,” it was still possible, my mother believed, for one’s kids to oversaturate on screens. As I matured, I inherited a similar belief and learned to approach screens and what they deliver with a measure of caution.

A few years later, in June of 1992, following the collapse of the Soviet Union, our family moved to Ukraine to engage in charitable mission work supported by churches in Texas. Among the wealth of experiences we collected over what would become a full and very meaningful year in our lives (about which I intend eventually to write more), we had the opportunity to live screenless. We rarely encountered a television. Moreover, smartphones and tablets wouldn’t make their introductions to society at large for more than a decade. We were strangers disconnected in a strange land, where even a brief, poor-quality, long-distance phone call literally required scheduling with the phone company ahead of time. While such conditions sound primitive and unacceptable by today’s standards, we found ways to keep ourselves occupied. When a screen simply isn’t anywhere to be found, it ceases to be an option, and attention naturally shifts elsewhere. We learned to live comfortably and happily without it.

Fast forward nearly 30 years, and here I sit, tapping out these very words one character at a time with my thumb on a modest rectangular screen that never leaves my side. I find it hard to remember what day-to-day living was like prior to constant connectivity and endless entertainment in our palms. We obviously made life work without this technological privilege for century after century of human history, but it took only a year or two of the current century to change us forever. We take for granted that each of us now carries in our pockets a tool far more advanced and complex than the earthshaking machines that sent men to the moon. With that kind of personal power, we all know there’s no going back, no matter how extreme the downsides we’ve discovered since.

Among this technology’s vast number of advantages — and there are, indeed, too many to name — there is really only one familiar disadvantage that matters: we seldom have the willpower to put these devices down as often as we should, turn them off, and interact with the real world we inhabit instead of staring endlessly into a carefully edited and framed projection of it.

Steve Jobs once said dismissively of market research, “People don’t know what they want until you show it to them.” The later success of his products would seem to have proved him right. While he wasn’t the inventor of the smartphone, per se, his “i” devices found themselves in the impatient, hungry hands of the majority very soon after their introduction to the market. Apparently, he alone knew what we really wanted, judging from the volume of cash we appreciatively threw at him. Once software and app developers got hold of the iPhone and similar devices, they found further ways to keep us hooked, until this “want” gradually morphed into a “need.” Whether we’re at home, at work, dining out, or even driving, we feel the itch in every inactive moment and silence reflection by reaching reflexively for our phones. Few of us know any longer how to sit still and quiet with our thoughts.

Ironically, many of the developers of these breakthrough devices and apps see their way through the distracting digital fog and distance themselves and their families from what they’ve wrought. Bill Gates is said to have established very strict limits on his kids’ use of technology. Jobs himself stated shortly after releasing the iPad that he would not allow his children to use one. Many of the movers and shakers of social media platforms shared in the cautionary Netflix documentary “The Social Dilemma” that they significantly limit when, if at all, their children may visit the sites they curate, and some even go so far as to enroll them in low- or no-tech schools. Jaron Lanier, an early pioneer of virtual reality and a well-known voice in Silicon Valley, published “Ten Arguments for Deleting Your Social Media Accounts Right Now” a few short years ago, among other works in a similar vein counseling caution in the technological universe we now inhabit. All in all, it is telling that these creators are not the strongest advocates for their creations.

A closer look, though, reveals it’s not the technology itself they’ve rejected but its overuse, abuse, or misuse. Any technology, like any tool, is fashioned for a specific function, but I can turn it to different ends if I so choose and its design allows it. A hammer’s designed purpose is to drive or pull a nail, but certain individuals have been known to employ it effectively as a weapon.

Our middle is currently in a months-long process of demonstrating she is mature enough to acquire a phone (and use it as intended). Good grades and behavior are the criteria. While it’s a great motivator for her, I admit mixed feelings about what waits for her at the end. This is a new rite of passage that we Gen Xers and earlier generations never experienced; we transitioned through adolescence just fine without it, so, naturally, we harbor concerns about what it should signify and when our kids have earned the right to carry, so to speak. They’ve never known a reality without a personal device for every person, so they feel less cautious about the change than those of us who remember a time when our attention faced vastly fewer interruptions and was more consistently present in the world around us.

While she stands to gain, I can’t help feeling like something will be lost. Given the option, we’ve found, kids will almost always pick a screen over any other available activity. It’s clear, consequently, that it takes a great deal of vigilance to monitor, educate, and, most importantly, model how to handle joining the ranks of the connected. If my kids observe me glued to this device, they will naturally assume the same posture.

In 1984, William Gibson published the seminal, critically acclaimed science fiction novel “Neuromancer,” which, in a nutshell, envisioned a future in which we essentially plug our conscious brains into digital reality. The story helped inspire the later box-office blockbuster “The Matrix.” While we aren’t precisely there yet (and I hope and pray we never are), reading the novel 25 years after its publication, I couldn’t help but observe that I was living in a version of Gibson’s vision. There is no need to physically “jack in” if we can’t pull our eyes from our screens. In a way, even then, we were already there.

The best science fiction, in my humble opinion, does not merely spin a fun and adventurous tale of gadgets, lasers, and spaceships. On the contrary, the greatest works, I would argue, closely scrutinize the present to a purpose. These stories presciently trace out causality and use the platform of an unknown future, wittingly or unwittingly, to describe where we’re headed if we do, or don’t, change course. The visions are often extreme and imprecise, making it a challenge to recognize whether we’ve arrived at said dystopia. I once expressed to a roommate incredulity about the likelihood of Bradbury’s future tale of the temperature at which books burn. He replied thoughtfully, “Why burn books if no one is reading them?”

I don’t know where we’re all headed with our screens, but I know there are times I think back to that disconnected year in Ukraine and realize their absence is rarely woven into the stories our family recounts about our life there. We didn’t miss them. Nonetheless, even as I appreciate being able to instantly send my parents pictures of their grandkids, pull up a detailed driving route to anywhere in the world, and post my ramblings online to someone like yourself (whose attention, incidentally, is still intact enough to see these thoughts to their conclusion; and I thank you for that), I sometimes wish I could return to a time when I wasn’t burdened with the task of monitoring my kids’ online presence and activity, or with paying responsible attention to my own.

But this is the world we now live in, and we take the good with the bad. So, perhaps we should blame neither the tools nor the toolmakers. They simply give us what we want.