The Unexpected Perspective
The Implications of Darwin and the Big Bang for Christians ... and Everyone Else


Range anxiety continues to be a problem with electric vehicles. Wireless induction recharging may help to reduce range anxiety.

            One of the great fears about owning and driving an all-electric vehicle is running out of juice.  Almost every driver has had the uncomfortable experience of watching the gas gauge rapidly approach "Empty" and having to search for a gas station.  However, unless you're driving out in the country, a gas station is probably not too far away.  And if the dreaded event happens and you do run out of gas, you can fairly easily get a container of gas from the nearest station, put it in your tank, then drive to the station.  A giant nuisance … maybe even a giant embarrassment … but actually pretty easy to resolve.

            But what if you're driving an all-electric vehicle?  There's no equivalent to a can of gas to stick in your tank.  So far as I know, no one makes a portable battery that you can attach to the car to give it enough fuel to get you to the nearest charging station.  And unlike gas stations, electric charging units are still not particularly commonplace. 

            Then when the driver is able to re-charge the vehicle, it takes a long time.  While the typical driver can put 15 or so gallons of gas in a vehicle in five minutes, re-charging an electric vehicle usually takes a minimum of 30 minutes at a high-power charging station, and about six to eight hours if one plugs into an ordinary outlet.  The thought of spending 30 minutes at a charging station is not very practical much of the time.

            So it's quite understandable that lots of people are nervous about driving an all-electric vehicle for fear of running out of juice.

            It's an obvious problem, but maybe the real problem is that we're all working off of the wrong "mental model".  The "mental model" I'm referring to is the one about filling the tank of your auto with gas.  We're all in the habit of letting our gas tanks get close to empty, then refilling the tank until it's full.  This model makes perfect sense when you think about the "nuisance factor" of putting petroleum in your auto or truck: when the task is so annoying, filling the tank completely each time is the sensible choice.

            But what if there were a different "mental model" for filling the tank?   The model I'm thinking about is related to the old riddle: how do you eat an elephant?  The answer, of course, is one bite at a time.  Applied to fueling your auto or truck, it would mean adding fuel a little bit at a time.  For a conventional vehicle, it would be as if you added one gallon of gas at a time.  Of course, it's pretty unlikely that you'll stop at a gas station fifteen different times, each time adding one gallon of gas, to refill the tank.  The same is true of an all-electric vehicle: you're very unlikely to stop the vehicle, plug in to a charging station for a short time, then do it all over again a short time later.

            There's no way to get petroleum into your vehicle without attaching a hose to it, but there actually is a way to get electricity into an electric vehicle without attaching a cord to the vehicle.  As I'll show below, this could provide a convenient way to "eat the elephant", or recharge your electric vehicle.

            The idea is based upon the concept of induction.  Basic electromagnetic induction works by supplying power to a charging station that includes an induction coil.  The electric current causes the induction coil to create an electromagnetic field around itself, capable of transferring power to a second induction coil nearby.  Applying this principle, one can recharge an electric vehicle without plugging it in to the electric supply – cordless recharging.  The idea of wireless transfer of energy was first suggested more than a century ago by renowned inventor Nikola Tesla.

            Several companies now sell devices to provide cordless recharging.  One, named Evatran, sells a combination "pad" and "pickup" for between $2,500 and $4,000.  Another company, named Hevo, sells a similar system for about $3,000.  The "pad" is attached to the source of the power supply.  The "pickup" is equivalent to the plug on the vehicle where one would normally plug in the power supply.  Electricity travels through the air from the pad to the pickup.

            Evatran was started in 2009 and began selling its product to the public in 2014.

            BMW and Daimler, two of the big German automakers, have announced plans to develop wireless recharging systems, to be installed in a garage or carport.

            While it isn't an automaker, Qualcomm is reported to have developed a number of products and tools that will be valuable in wireless recharging.

            How efficient is this "through the air" recharging?  Not quite as good as with a cord, but it's pretty good.  When you plug in the power supply, you get about 95 to 99% efficiency.  In contrast, when you do it using an induction system, it's about 84 to 90% efficient.
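To see what that efficiency gap means in practice, here is a rough sketch of the extra grid energy an induction system would draw to deliver the same charge. The battery size (60 kWh) and the midpoint efficiencies are illustrative assumptions, not figures from this post.

```python
# Rough comparison of grid energy needed to deliver a full charge,
# using the efficiency ranges cited above.
BATTERY_KWH = 60.0  # illustrative battery size; not from the post

def grid_energy_needed(battery_kwh: float, efficiency: float) -> float:
    """Energy drawn from the grid to put battery_kwh into the pack."""
    return battery_kwh / efficiency

corded = grid_energy_needed(BATTERY_KWH, 0.97)     # midpoint of 95-99%
induction = grid_energy_needed(BATTERY_KWH, 0.87)  # midpoint of 84-90%

print(f"Corded:    {corded:.1f} kWh from the grid")
print(f"Induction: {induction:.1f} kWh from the grid")
print(f"Extra loss with induction: {induction - corded:.1f} kWh")
```

On these assumed numbers, the induction pad wastes roughly 7 extra kWh per full charge, which is real but hardly prohibitive for the convenience gained.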

            The concept of "one bite at a time" re-charging is actually being used on some bus systems.  Milton Keynes, a town northwest of London in the UK, has one of its bus lines outfitted with induction charging.  Eight buses on Line 7 of the Milton Keynes system are all-electric, and each of the buses has been outfitted with an induction charging system.  When a bus reaches the end of the line and is ready to turn around, it pauses for two to four minutes for a quick recharge, then starts on another bus run.  Each bus is recharged this way throughout the day.

            The operators of the Milton Keynes bus system say they save about 80 cents/mile by running electric buses rather than diesel ones.  This is a combination of lower fuel costs and reduced engine repairs.  The eight electric buses on Line 7 run a combined 425,000 miles/year, so the marginal cost savings is about $340,000/year.

            How much does it cost to install such a system?  Milton Keynes uses a charging system built by IPT Technology, a German company, and it costs about $130,000/charging pad.  Assuming one pad is placed at each end of the bus line, the payback on any single line is about 9 months, making this a very good investment.  The Milton Keynes bus operator is looking to convert other lines on the system.  Besides Milton Keynes, such wireless induction charging bus systems run in Salt Lake City, five cities in California, and European cities such as Utrecht, Genoa, and Turin.  Los Angeles plans to install such a system next year.
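The payback arithmetic above can be checked directly from the figures the post cites. The only assumption here is the one the post itself makes: two pads, one at each end of the line.

```python
# Back-of-the-envelope check of the Milton Keynes figures cited above.
SAVINGS_PER_MILE = 0.80   # dollars saved per mile, electric vs. diesel
MILES_PER_YEAR = 425_000  # combined annual miles, eight buses on Line 7
PAD_COST = 130_000        # dollars per IPT charging pad
PADS = 2                  # one pad at each end of the line (per the post)

annual_savings = SAVINGS_PER_MILE * MILES_PER_YEAR
payback_months = (PAD_COST * PADS) / annual_savings * 12

print(f"Annual savings: ${annual_savings:,.0f}")
print(f"Payback: {payback_months:.1f} months")
```

The result, just over nine months, matches the "about 9 months" payback claimed above.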

            Wireless induction systems make great sense on bus lines, but could the same concept be applied elsewhere?  I think the idea can also be applied to automobiles.  Autos rarely travel along the typical fixed route of a bus, but the idea of periodic wireless recharging of autos and other vehicles, the core of the Milton Keynes system, is quite reasonable.

            The obvious place to do this is in parking lots.  It's become increasingly common to see electric charging stations in parking lots, but it still requires one to attach a cord to the auto or truck.  Why not make it so the driver doesn't have to do anything more than park the vehicle in the space?  Wireless induction creates the possibility for that.  Here's how:

  • Imagine that the vehicle is outfitted with a standard "pickup".  It could be installed by the automaker.
  • Imagine, also, that the parking space has a built in "pad". 
  • The driver would park the car in the space.  He or she could then use a mobile phone app to instruct the pad to recharge the vehicle while it's parked.  Charging could be billed by the minute, with the cost charged to the driver's credit card.
  • Besides credit card billing information, the app could include instructions about whether the vehicle should be charged if peak load pricing is in effect.
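The flow sketched in the bullets above is simple enough to capture in a few lines of code. Everything here is hypothetical: the class name, the per-minute rate, and the peak-pricing flag are illustrative stand-ins, and a real system would talk to the pad hardware and a payment processor.

```python
# Minimal sketch of the per-minute billing flow described above.
# All names and rates are hypothetical illustrations.
from dataclasses import dataclass

RATE_PER_MINUTE = 0.05  # dollars per minute; illustrative assumption

@dataclass
class ChargingSession:
    minutes: int
    charge_during_peak: bool          # driver's preference, set in the app
    peak_pricing_active: bool = False # whether the utility has peak pricing on

    def should_charge(self) -> bool:
        # Pause charging if peak pricing is active and the driver opted out.
        return self.charge_during_peak or not self.peak_pricing_active

    def bill(self) -> float:
        """Amount billed to the driver's credit card for this session."""
        return self.minutes * RATE_PER_MINUTE if self.should_charge() else 0.0

session = ChargingSession(minutes=60, charge_during_peak=False)
print(f"Billed: ${session.bill():.2f}")  # an hour at 5 cents/minute
```

The point of the sketch is the driver experience: park, set a preference once in the app, and let billing happen by the minute in the background.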

More and more, drivers don't fumble for change at parking meters, choosing instead to pay for parking with a mobile phone app.  Why not do the same for electric charging?  Pretty much any parking space could be outfitted with a charging pad.

            The problem, of course, is who is going to pay to install charging pads?  How about the electric utility?  They have a natural interest in selling power.  But other parties could have an interest in selling electric power if they could be shown a way to make money doing it.  So how could someone make money selling power this way?

            Anyone who runs a business with customers driving to its store or office might want this.  Imagine having your customers park at your place of business, conduct business with you, then have the benefit of getting the auto at least partially recharged.  Doctors and dentists should love this.  I can't think of a time I've been to a doctor or dentist and it took less than one hour.  In the space of one hour, my electric vehicle could at least be partially recharged.  Even at the rate of a wall outlet, which usually takes 6 to 8 hours to recharge a vehicle, one could get about 1/8 of a full charge in an hour.  Typical electric vehicles now have a range of at least 250 miles, so one hour of recharging should provide at least an additional 30 miles.  If it's a fast recharger, it could easily completely re-charge the vehicle.
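The "30 extra miles per hour of parking" figure above follows directly from the range and recharge times the post cites, using the slow end (8 hours) of the wall-outlet estimate:

```python
# Quick check of the "partial recharge while parked" arithmetic above.
RANGE_MILES = 250      # typical EV range cited in the post
FULL_CHARGE_HOURS = 8  # slow (wall-outlet) recharge, upper end of 6-8 hours

miles_added_per_hour = RANGE_MILES / FULL_CHARGE_HOURS
print(f"One hour at wall-outlet speed adds about {miles_added_per_hour:.0f} miles")
```

Even the worst case, a slow pad and a one-hour appointment, adds a meaningful buffer against range anxiety.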

            Why might a business want to make this type of investment?  It might do it simply as a way to attract customers, but a more likely reason would be as a way to generate additional revenue.  The company could charge customers for electricity by the minute, the rate depending upon how fast the recharging system operates.  Presumably, it would charge a premium over what the driver would pay for electricity if he or she did the recharging at home, but because this is a convenience, most likely the driver will be willing to pay a premium. 

            Could wireless re-charging of vehicles be a viable business?  If re-charging pads can be sold for $2,500 to $3,000, I believe one can easily create a viable business with as little as $4.00 to $5.00 in revenue per charging pad per day, over and above the cost of the power.  Assuming the charging station could be priced at $2.00/hour plus power, it would only need to be used 2 to 2½ hours each day.  Would drivers pay $2.00/hour to use such devices?  I believe they would, especially if they're concerned about running out of power.  Simple convenience suggests this makes a lot of sense.
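The viability numbers above can be sanity-checked with the same back-of-the-envelope approach. The midpoint figures below are my interpolations of the ranges the post gives, not additional data:

```python
# Viability arithmetic for a $2.00/hour charging pad, per the post.
PAD_COST = 2_750              # midpoint of the $2,500-$3,000 hardware cost
REVENUE_TARGET_PER_DAY = 4.50 # midpoint of $4.00-$5.00/day, excluding power
HOURLY_RATE = 2.00            # dollars per hour, plus the cost of power

hours_needed = REVENUE_TARGET_PER_DAY / HOURLY_RATE
days_to_recover_hardware = PAD_COST / REVENUE_TARGET_PER_DAY

print(f"Usage needed: {hours_needed:.2f} hours/day")
print(f"Hardware cost recovered in about {days_to_recover_hardware / 365:.1f} years")
```

At that utilization, the pad's hardware cost is recovered in under two years, which is consistent with the post's claim that this could "easily" be a viable business.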

            Alternatively, the electric utility might install these charging pads, then pay some type of rental fee to the owner of the parking lot.  Again, this could be a revenue opportunity for both the parking lot owner and the electric utility.  An electric utility might want to make these available, even providing the power for free at certain times of the day, simply to help reduce the load at other, peak times.  It's been reported that utilities in California are now experiencing excesses of power during the day due to solar installations, so providing free charging at certain times might benefit more than just the driver.

            Using the recharging service would be completely optional for the driver.  The key is to create a convenient, simple experience.  In the ideal scenario, the driver parks the car, then uses an app on his or her phone to instruct the charging device to recharge.  It might even happen more or less automatically.

            The core idea here is convenience and user experience.  The easier it can be made, the more likely drivers are to want to adopt electric vehicles.  The induction concept gives the driver another way to avoid running out of power, and a convenient way to add it.

            Induction based wireless charging will provide a convenient way to overcome the problem of "range anxiety", the fear of the car running out of juice.  It should provide one more way to make electrics the vehicle of choice in the days to come.


Recent studies suggest it's more complicated than we thought.

            Hurricanes have definitely been in the news this year.  Storms named Harvey, Irma, and Maria did billions of dollars of damage.  We still haven't counted up the full cost.  Four weeks after Hurricane Maria tore up Puerto Rico, the vast majority of the island is still without power. 

            Without question, these were terrible storms.   Many quickly proclaimed that the storms must have been the result of global climate change, and that the public should prepare for many more.  But is that an accurate assessment?  The funny thing is that the same claim was made after Hurricane Katrina pounded the Gulf Coast in 2005, causing incredible destruction, including an unprecedented flood of New Orleans.

            And then another funny thing happened.  Not a single hurricane struck Florida, and not a single major hurricane (Category 3+) hit anywhere in the USA, for a decade.  What about Hurricane Ike and Superstorm Sandy?  Both were terrible storms, but Ike was a Category 2 when it hit, and Sandy wasn't even officially a hurricane when it made landfall.

            So what's happening?  Greenhouse gas emissions are causing changes in our climate.  I don't question that, but what those changes mean for hurricanes is uncertain, and will likely remain so unless and until scientists gain a better understanding of the relationship between climate change and hurricane frequency and intensity.

            As a starting point for this discussion, consider how and why hurricanes form.  The most basic reason is a combination of warm ocean water and certain types of intense thunderstorm activity.  The rule of thumb is that the water temperature must be at least 79.5 degrees Fahrenheit to form and/or sustain a hurricane.  Without warm water, hurricanes seem unable to form or sustain themselves.  That's why hurricanes break up when they encounter land.

            Water temperature in the Atlantic Basin didn't dramatically decrease during the years between Katrina and the 2017 hurricanes, so why did hurricane intensity go down?  Most likely because hurricane formation, strength, and durability depend upon multiple factors, not just warm water and thunderstorms.  So let's take a look at what those appear to be.  I think you'll see that this is indeed very complicated, and thus we should be careful about our predictions of hurricane intensity due to global climate change.  With that in mind, let's look at what else may be at play.

            One very key factor is what is called wind shear.  It has to do with differences in wind speed between the surface and the top of the troposphere, around 40,000 feet above sea level.  Wind shear is a factor in both hurricanes and storms on land.  The interesting thing is that on land, wind shear can make storms more dangerous, especially with what are called supercells, but wind shear is probably the worst enemy a hurricane can have.  Wind shear, if it comes from a certain direction, tears hurricanes apart.  Interestingly, if the wind shear runs in a north/south direction, it doesn't seem to affect the hurricane, but if it runs east/west, it can be deadly.

            A 2007 study done by researchers at the University of Miami said that wind shear would likely reduce the frequency and intensity of hurricanes in the Atlantic Basin and the Eastern Pacific.  The data for the ten years up to 2017 seems to bear that out, at least for the Atlantic Basin.  At the same time, the Miami researchers said wind shear would likely have little effect on the frequency and intensity of hurricanes in the Western Pacific and Indian Oceans.

            Why the difference?  The Miami researchers hypothesized that it is due to what's called the "Walker circulation".  This phenomenon has to do with the interplay between a high pressure system that tends to reside over the Eastern Pacific and a low pressure one that tends to reside around Indonesia.  The two systems create a pressure gradient, and the interplay between them is dynamic.  From time to time the gradient weakens, or even reverses, causing a phenomenon increasingly familiar from weather reports: El Nino.  Conversely, the Walker pressure gradient periodically strengthens, causing the opposite phenomenon: La Nina.  El Nino tends to warm the waters of the Eastern Pacific.  In the USA, the downside of El Nino is more tornadoes and other bad weather in the Southeast in the wintertime, but a nice side benefit is that hurricane activity in the Atlantic Basin goes down.  Conversely, in a La Nina, Atlantic hurricane activity tends to intensify.

            The Miami researchers hypothesized that the Walker circulation was weakening, suggesting more El Nino events.  That may have been happening over the past few years, and might explain the respite Florida had from hurricanes for ten years.  However, another study, the Twentieth Century Re-Analysis Project, calls that into question.  The project has been accumulating worldwide weather data for the period 1851-2014, and researchers involved say they do not see any long-term weakening or strengthening of the Walker circulation.

            In studying wind shear, other scientists have made another interesting observation: where hurricanes gain their greatest intensity may be shifting.  Historically, hurricanes have been strongest in the lower latitudes, in both the Northern and Southern Hemispheres.  That makes intuitive sense because water temperatures closer to the Equator are likely to be warmer.  That's probably still the case, but peak hurricane intensity seems to be shifting away from the Equator, both in the North and the South.  The cause of the shift: wind shear.  The researchers found that closer to the Equator, increasing wind shear is weakening hurricanes, while farther away, wind shear may be weakening, at least in a relative sense.  That would suggest that the latitude at which hurricanes are most intense is increasing over time, and in fact, the researchers note, that's what seems to be happening.  The bad news is that population centers in higher latitudes can expect more intense hurricanes over time, other things being equal.  Once again, however, the key variable appears to be wind shear.

            Why were Hurricanes Irma and Maria so strong?  The evidence suggests that wind shear was pretty weak this year in the Atlantic Basin.

            Strong wind shear also seems to explain why hurricanes don't form in places like the South Atlantic.  Dr. Bill Gray, the noted Colorado State meteorologist who was long an authority on hurricanes, said this is clearly the case for the South Atlantic.  Only one known hurricane, Hurricane Catarina in 2004, has formed in the South Atlantic, even though water temperatures there are very warm.

            When it comes to hurricanes, wind shear may be our best friend, or at least our best weapon against them.  The good news is the hypothesis that climate change may be increasing wind shear.  If so, higher water temperatures would be offset by increased wind shear.  That could explain why we had ten years of relative calm in the Atlantic Basin, bookended by several years of bad hurricane activity.

            But wind shear isn't the only variable to consider besides water temperature.  Another factor is what is called the Atlantic Multidecadal Oscillation.  Researchers such as Rong Zhang and Thomas Delworth have studied not only Atlantic Basin hurricanes but also rainfall in the Sahel region of Africa and the Indian Monsoon. Incidentally, Gilbert Walker of the "Walker circulation", described above, became famous because of his identification of patterns in the Indian monsoon at the beginning of the 20th century.  Zhang and Delworth, as well as others, have noted the relationship between these seemingly disparate weather events.  Thus, the frequency and intensity of Atlantic Basin hurricanes, as well as Sahel rainfall and Indian monsoons, may depend upon the Atlantic Multidecadal Oscillation (AMO).   What impact is climate change having on the AMO?  Unclear.

            But an even more intriguing, and recent, study suggests even another variable in determining hurricane intensity.  Earlier this month, Michael Toomey published findings online in Geology that about 12,000 years ago, Florida was ravaged by severe Category 5 hurricanes. The USA mainland was struck by Category 5 hurricanes only about four times in the past 100 years, but Toomey suggests it may have been worse back then.  Here's the amazing thing: Toomey says that the water temperature was likely lower back then. 

            How could that be?  Toomey believes that the hurricane suppressing effects of cooler water were outweighed by the side effects of slower ocean circulation. So the water might have been cooler even than now, but ocean currents were such that extremely powerful hurricanes ravaged Florida.

            That was 12,000 years ago.  What evidence does Toomey have?  He studied sediment cores of what are called turbidites from the Dry Tortugas, a series of islands near Key West, Florida.  Turbidite is a rock that forms when sediments are disturbed and flow down across the ocean floor.  Turbidites are often the result of earthquakes, but there is no evidence of earthquakes in the Dry Tortugas, so there must be a different explanation.  Toomey believes these turbidites were formed as the result of intense hurricanes.

            Toomey was able to measure the grain size of the turbidite sediments.  Grains from about 12,000 years ago averaged 23 microns in diameter, whereas more recent ones average 19 microns.  Grain size thus served as a proxy for hurricane intensity.

            What are we to conclude from all of this?  I think the key takeaway is that the effects of climate change are clearly complicated, maybe more complicated than we ever thought.  Once again, I am not denying that greenhouse gases are causing climate change.  However, exactly what the relationship is between climate change and hurricane frequency and intensity remains up in the air.   Warmer waters certainly would suggest greater frequency and intensity, but as noted here, wind shear, ocean circulation, rain patterns on other continents, and the Atlantic Multidecadal Oscillation also seem to play important roles.

            Given the reality of climate change, perhaps the best we can hope for is that climate change will affect wind shear, ocean currents and the oscillation in ways that will tend to reduce hurricane intensity and frequency.  In the meantime, when disasters like Hurricanes Harvey, Irma, and Maria occur, let's focus on helping victims.  We know what difference that will make.

            But even if hurricanes are becoming more frequent and stronger, there are things we can and should be doing.  However, as I pointed out in a recent blog post, what we should be focusing on actually doesn't have anything to do with climate change.   That's not because climate change is unimportant (it is important), just that we can and should focus on things we can more immediately control.  Those include strengthening building codes, controlling construction in flood zones, and stopping the artificial subsidization of flood insurance.  Doing those things will help us to control, or reduce, the cost of hurricane damage.  We're not going to eliminate hurricanes, even if we completely halt climate change, but we can significantly reduce the cost of hurricane damage if we pursue some of these policies.




The Underlying Reason We're Wasting Time When We Listen to the Predictions of "Experts"

            Ever wonder how predictions turn out?   When it comes to predictions about greenhouse gases, it's actually a case of great news and lousy news.

            First the great news:

  • Between 2005 and 2016, while the US economy grew by 17%, energy usage was flat, yet CO2 emissions decreased by 14%;
  • Sulphur dioxide emissions have decreased by 82% since 2005;
  • Carbon dioxide emissions from coal are now back to the levels they were in 1978;
  • Carbon dioxide emissions related to electricity generation are down 25% from 2005 levels;
  • Energy costs now represent only 4.5% of household spending, the smallest share ever recorded.

Whether or not you believe in climate change (and I definitely do), you should find these data, as reported by the Natural Resources Defense Council, to be very encouraging.

            So what could be the lousy news?  It turns out the predictions made around 2006 for what 2016 would turn out to be were wildly inaccurate.  Consider the chart above:

  • The 2006 prediction for CO2 emissions was 24% higher than the actual result, meaning that we actually reduced emissions substantially below where we expected;
  • The 2006 prediction for coal power generation was off by nearly half;
  • Installed solar and wind capacity turned out to be far higher than forecast.

This represented just a ten year prediction, and the prediction was pretty inaccurate.  Fortunately, it went in a favorable direction.

            These predictions are made every two years by the US government's Energy Information Administration.  Those who have studied the projections report that the projections are consistently wrong, sometimes spectacularly wrong.

            But it would be unfair for me to single out the people who made these energy predictions.  That's because the field of "lousy predictions" is quite crowded.  Consider some of the following:

  • A study done at Hamilton College in New York about the accuracy of the predictions of political pundits found those predictions were no more accurate than a coin toss. 
  • Governments tend to do a lousy job of picking high tech enterprises to back, what's sometimes referred to as "industrial policy".   Anyone remember the solar power company Solyndra?
  • If someone in 1987 was predicting what 2017 would look like, do you think they would have predicted any of the following:

          .. The demise of Japan as America's top economic competitor, replaced by China?

          .. The rise of the Internet, from an academic/military project to almost the backbone of the world economy?

          .. The ubiquity of mobile phones?

          .. The growth of renewable energy?

          .. The reality of electric powered autos and airplanes?

Unfortunately, our record of making predictions about technology, as well as its impact on society, is about as good or bad as that of political pundits. 

            There is one big exception: Moore's Law.  Gordon Moore first formulated this "law" – that the number of transistors on a chip would double about every 18 months to two years – back in 1965, and the projections have been accurate for 50 years. 
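Moore's Law is simple enough to state as a formula: a count that doubles every fixed interval grows as a power of two. The starting count and exact doubling period below are illustrative, just to show the scale of 50 years of doubling.

```python
# Moore's Law as stated above: transistor counts double every 18-24 months.
# Starting count and the 2-year doubling period are illustrative assumptions.
def transistors(start_count: float, years: float, doubling_years: float) -> float:
    """Projected count after exponential doubling."""
    return start_count * 2 ** (years / doubling_years)

# 50 years at a 2-year doubling is 25 doublings, a roughly
# 33-million-fold increase.
growth = transistors(1, 50, 2)
print(f"Growth over 50 years at a 2-year doubling: {growth:,.0f}x")
```

That compounding is exactly why a 50-year-accurate prediction is so remarkable: small errors in the doubling period would have compounded into enormous misses.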

            Which leads me to believe that the only three things we can really count on are the usual death and taxes, and also Moore's Law.  Most every other prediction should be taken with the proverbial grain of salt.

            Much beyond Moore's Law, our capacity to predict the impact of technology on our future is pretty lousy.  I think this observation can be applied across many technology topics, but I'd like to limit the discussion to greenhouse gases and global warming.  So what are the implications?

            The first, of course, is that we should take our forecasts of what the world will look like in 2027 and 2047 not with a grain of salt but with a heaping portion.  Maybe we should simply ignore them.  Not because they were poorly constructed or badly thought out, and not because the underlying science is bad (it isn't), but because there may simply be too many variables.  Let's go back and consider why some of the 2006 predictions were so far off.

            One likely explanation is the large-scale substitution of natural gas for coal in electric power generation.  The chart shows a giant substitution.  Why did this happen?  The key reason is that natural gas prices went down so much.  And why did those prices go down?  Because of the fracking revolution.

            Not many people predicted that.  Instead, you may recall, a few years ago there were predictions of "peak oil", meaning that the world was at, or near, its peak potential production of oil and that production would soon permanently decline.  I haven't heard much discussion of that idea recently. 

            Another reason is that wind and solar have gained much greater acceptance than was predicted.  Hopefully, the current 2027 and 2047 projections are similarly understated.

            The reduction in total energy consumed was definitely greater than expected.  One might argue that this was the result of lower-than-expected economic growth beginning in 2008.  Perhaps.  However, a more likely explanation is that business and industry continue to get more efficient.  The lean manufacturing revolution has certainly had an impact in that regard.  Another reason is the greater role of services, and the relatively lower share of manufacturing, in the economy.  The USA produces as much as it ever has, but it's doing it today with far fewer workers and other economic inputs.  Huge amounts of waste have been removed from processes.  Increases in efficiency explain why households are allocating a record-low percentage of their budgets to energy.

            Not only do we do a lousy job of predicting the future, we humans also tend to do a lousy job of preparing for distant events.  Maybe that's because we're so good at responding to immediate threats and dangers.

            Why are we so good at responding to immediate threats?  Because we're genetically hardwired to do so.  After all, when confronted with immediate threats, such as poisonous snakes and predatory animals, humans are very adept at rapid black-and-white thinking.  We may occasionally overreact, but better to do that and still be alive than to underreact and wake up dead.  Doubtless, a number of our ancestors didn't respond quickly, and failed to pass their genes along.  We're the progeny of those who were very effective at responding to danger, real or perceived.

            But what's the typical reaction when the average person learns that something they're doing today could create a problem in ten, twenty or thirty years?  How most people approach diet and exercise should be instructive.  For most people, making changes to diet and exercise today in order to prepare for a problem that will likely occur in ten to thirty years doesn't get much response.  Do you really want to pass up that nice dessert after your dinner this evening in order to prepare for a potential problem with diabetes in 10 or 20 years?  Yes, a certain percentage of the population will do that, but most people don't. 

            Survival for humans, like all other species, has always depended upon successful reproduction from one generation to the next.  What might happen in 30 to 50 years just has never been relevant … until maybe now.  Genetically, however, every species is equipped to deal with only the next generation, not the distant future.

            So based upon all of this, when it comes to predictions about the future climate, I recommend the following:

Tune in ESPN's SportsCenter or HGTV and Turn Off CNN, Fox, and MSNBC

            Why turn off CNN, Fox, MSNBC and their brethren?  Because based upon what I've pointed out, there's nothing particularly authoritative about what they're saying.  If their predictions are about as accurate as a coin toss, why not just get out your Ouija Board?  Notice, I'm not saying one network is right and the others are wrong.  The predictions of both progressives and conservatives are almost uniformly lousy.  So unless you're watching for pure entertainment value, skip the cable networks; if you really want to watch politics, try C-SPAN. 

            Now if you're just watching TV for entertainment, consider what we do at my house: my wife watches HGTV and I watch SportsCenter on ESPN.  With hundreds of channels available, I'm sure you can find entertainment without the pretense of authoritative prediction.

            Now if you really do want accurate 2027 or 2047 predictions of the energy and climate parameters shown above, then given everything I've said, you may as well be searching for your version of the Holy Grail. 

            So does that mean everyone should just give up and tune out?  No, while we're pretty lousy at predicting the future, we're actually pretty good at creating it, if we focus our attention in the right places.  So with respect to climate change, what are those "right places"?  Let me suggest several of them.

#1: Encourage Funding of Basic Research, Technology, and Business Startups

            If you look back at why past predictions didn't turn out as expected, a key reason has to do with technological change.  Changes in fracking technology, as well as wind and solar technology, have profoundly re-ordered energy markets.  Then there's Elon Musk and Tesla.  Anyone want to claim they predicted that?  The investments that get made in technology will likely have a big impact over the next 10 to 30 years.  So how do we encourage this?  Here are several ideas:

  • Encourage funding of basic research at NASA and DARPA
  • Encourage basic research at universities and other institutions

Why NASA and DARPA?   Many technological innovations have been spun off from the space program over the past sixty years.  DARPA has conducted lots of research on behalf of the US military.  DARPA really did help create the Internet.  Both entities have produced impressive results that have been commercialized.  Moreover, both Democrats and Republicans seem to like NASA and DARPA.

            Another idea is to encourage business startups.  Silicon Valley is, of course, the model here, and what's come out of Silicon Valley drove remarkable change between 1987 and 2017.

But we should be encouraging this all over the country.  A recent book by Ross Baird, entitled The Innovation Blind Spot, suggests that there's a great deal of innovation happening in places other than Silicon Valley, New York, and Boston, but we're overlooking much of it.   In fact, the next big idea may be from Santa Fe, not San Jose, if we would just pay attention.

#2: Focus Regulatory Change Efforts at the State and Local, Not National Level

            Unless you just arrived from Mars, you know that government at the federal level in the USA is grid-locked.    Each side blames the other one.  It's not likely to change any time soon.  Liberals and progressives have been whining that President Trump took the USA out of the Paris Climate Accord, and that there's a systematic effort afoot to deny climate change by the Administration. 

            Probably so.  So what are we going to do about it?  I say, stop whining and focus on what can be done.  Change may not be practical right now at the Federal level, but there's huge opportunity at the state and local level.  Here are some ideas:

  • Change building codes, which are set at the local level, to encourage more environmentally friendly building materials.  We often forget that building materials make a huge contribution to greenhouse gases.  Lots of possibility here.
  • Take a cue from people like former New York Mayor Michael Bloomberg, who led a major effort while mayor to reduce greenhouse gases in New York.
  • Change how public utilities are regulated.  Most utility regulation is state and local.

#3: Find Ways to Incentivize Businesses and People to Reduce Greenhouse Gases

            Recall how I noted that humans do a lousy job of reacting to problems that are 10, 20 or more years in the future, but that we do a great job reacting to the immediate things. 

That's part of the problem of getting people to be concerned about climate change.  But we humans tend to do a great job reacting to incentives that are right in front of our faces. 

            So if we want people to react to greenhouse gases, we're much more likely to get a response if we offer an immediate incentive that will bring long term benefits.  One possible incentive is to offer electric utilities a higher rate of return on wind and solar installations than fossil fuel ones.  Businesses, and individuals, react to incentives that are right in front of them, much more so than distant threats.

            Most likely, when we arrive at the year 2026, we'll find that the predictions we made in 2016 are significantly off.  If history repeats itself, we'll actually have constructed far more renewable energy capacity by 2026 than we've been predicting.  Even better, our CO2 emissions will be a good deal lower in 2026 than we've projected.  It's too bad we're lousy forecasters, but on the other hand, we've demonstrated that we're pretty good at creating technology that makes the predictions wrong.  So ignore the pundits and focus on the places where real change occurs.



Lots of people are upset that President Trump is gutting the Clean Power Plan. Here are some practical responses to this.

            For many people, particularly environmental activists, the sky did really fall when US President Donald Trump took steps to gut the Clean Power Plan, the cornerstone environmental action of the Obama Administration.  On its face, the action looks like a body slam to steps taken to limit coal use in electric generation.  So if, like me, you're concerned about greenhouse gas emissions and global warming, what is the right response?

            The first thing I say is, time to stop whining about it.  The sky is not falling because of the Trump Administration's decision.  Instead, read on for some practical things that can be done.       

            Does Federal policy on climate really make that much difference?  I'm not so sure; and if it doesn't make that much difference, we should focus our attention on the things that really do make a difference.

            President Trump took his action to help save coal.  Scientific American reported that coal use was down 9% in 2016, the third year in a row that overall coal use has declined.  A good example of this trend is Duke Energy, a major electric utility in the Southeast.  Duke has gone from 70% of electricity generated by coal in 2008 to 42% today.   So let's first ask, why is coal consumption going down?

             It's been declining for three key reasons: 1) Americans haven't been using as much electricity as before; 2) natural gas has become more competitive; and 3) alternative energy sources such as wind and solar have also become more cost competitive.  While each of these factors is different, they share something in common: they're all about economics and have virtually nothing to do with Federal government policy.   

            So will Trump's new policy – gutting the Clean Power Plan – make a meaningful difference?  Many experts say the trend away from coal is going to continue regardless.

            So just why do I think Trump's decision won't make a significant difference?  One part of the argument has to do with the plans of electric utilities themselves.  26 of the 50 US states sued to stop the Clean Power Plan from being implemented.  Presumably, these were the states controlled by Republican opponents of the Clean Power Plan, and skeptics of global warming.  Reuters did a survey of the 32 electric utilities that operate in those 26 states.  Here's what they told Reuters:

  • 20 said the Trump order will have no impact on their plans to reduce coal usage
  • 5 said they're reviewing the implications of the order
  • 6 provided no response
  • Only 1 of the 32 said it would prolong the life of coal facilities.

Even if the other 11 who were non-committal end up responding in a way favorable to Trump, nearly two thirds said they wouldn't. 

            I think one can make the argument that the Trump action against the Clean Power Plan is sound and fury signifying nothing. 

            But what if I'm wrong?  I readily admit, I could be completely wrong.  So assuming I'm wrong about the impact of Trump's action on coal usage, let me suggest five practical things that can be done to continue, even speed up, the demise of coal. Some of these may be a bit unexpected, but that's a key objective of this blog: provide unexpected perspectives on issues.  Here's what I think can be done:

#1: Promote Use of Fracking

            Promoting fracking may turn a number of environmental activists off.  I appreciate that, and I agree that fracking has environmental consequences, but promoting this in the short term makes great sense if your objective is to reduce greenhouse gases.  If you think fracking stinks, please hold your nose for just a moment while I explain my thinking.        

            Let's go back to two of the key reasons why coal is a dying industry: 1) improving economics for wind and solar; and 2) natural gas becoming more cost competitive.  The reason natural gas has become so cost competitive is the dramatic increase in supply in the USA.  That's due to one thing – fracking technology.  Admittedly, wind and solar are definitely more environmentally friendly than natural gas, but wind and solar cannot possibly replace coal the way that natural gas can, at least in the short term.  So for the short and near term, natural gas is a great alternative to coal.  Coal plants can be converted to natural gas, but you can't turn a coal plant into a solar or wind facility.  While natural gas produces greenhouse gases, too, its greenhouse gas impact is substantially less than coal's.

            If you really don't like fracking, the next best alternative in the short term is to import liquefied natural gas, especially from places like Qatar.  That, however, has two huge disadvantages: 1) it worsens the balance of payments; and 2) it increases dependence on energy from the Middle East.  In comparison, promoting domestic fracking is one of the two best strategies for reducing coal usage.  The irony, of course, is that while the Trump Administration is trying to promote coal usage, its simultaneous promotion of the oil and gas industry works against coal.

#2: Promote Usage of Smart Metering by Utilities

            A second way to counteract the Trump action is to encourage electric utilities to expand usage of smart meters.  A smart meter can be remotely managed and can provide minute by minute information about electric usage.  Smart meters provide advantages to customers, utilities, and to the environment.   The advantages to customers include:

  • Far better data about usage
  • Useful data to help the customer adjust habits in a way that will reduce monthly bills
  • Reduce blackouts.

Smart meters are also very advantageous to the electric utility.  They:

  • Eliminate the need to send an army of people out to read meters
  • Provide more timely information about the electric grid
  • Permit the utility to utilize resources more efficiently
  • Permit dynamic pricing (i.e., different prices depending on the time of day)
  • Help avoid the cost of building new plants.

They're also beneficial to the environment because they reduce the need for new plants.
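To make the dynamic-pricing point concrete, here is a small sketch of how a time-of-use tariff, billed from hourly smart-meter readings, compares with a flat rate.  All of the rates and usage figures below are invented for illustration; no actual utility's tariff is being quoted:

```python
# Hypothetical time-of-use tariff: $/kWh by hour of day (0-23).
# These rates are illustrative only, not any utility's actual tariff.
def tou_rate(hour):
    if 14 <= hour < 19:                      # afternoon peak
        return 0.24
    if 7 <= hour < 14 or 19 <= hour < 22:    # shoulder periods
        return 0.12
    return 0.07                              # overnight off-peak

def daily_bills(hourly_kwh, flat_rate=0.13):
    """Compare one day's bill under a flat rate vs. time-of-use pricing,
    given 24 hourly kWh readings from a smart meter."""
    flat = sum(hourly_kwh) * flat_rate
    tou = sum(kwh * tou_rate(hour) for hour, kwh in enumerate(hourly_kwh))
    return flat, tou

# A household that shifts laundry and dishwashing to overnight hours:
usage = [2.0] * 7 + [1.0] * 7 + [0.5] * 5 + [1.0] * 3 + [2.0] * 2
flat, tou = daily_bills(usage)   # flat: $3.965, time-of-use: $3.06
```

In this made-up example, the household that shifts heavy usage to off-peak hours pays noticeably less under time-of-use pricing, which is exactly the behavior-shaping benefit dynamic pricing is meant to deliver.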

            Unquestionably, smart metering is something that's advantageous to consumers and the utility companies, themselves, and adoption should lead to reduced carbon emissions.  The even better news is that adoption has nothing to do with the Federal government.  The Trump action has zero impact on this.

            If smart metering is something that benefits companies and consumers alike, how do you encourage its adoption?  Get state utility commissions to provide incentives to companies to adopt the technology.  

#3: Get Investors to Pressure Electric Utilities to Switch Fuel Sources

            In the past few years, more and more companies have come under pressure from various advocacy groups, including investors, to change policies.  Climate change is an emerging area in that regard.  Interestingly, the Norwegian Sovereign Wealth Fund has been pressuring American utility companies not to build coal plants.  Why would this make any difference? 

            Norway has benefitted tremendously from drilling for North Sea oil.  Unlike most other countries with large oil deposits, Norway wisely established a sovereign wealth fund using royalties from North Sea oil.  The fund now has more than one trillion dollars in assets.  Needless to say, the Norwegian Sovereign Wealth Fund is a force to be reckoned with in investment circles. 

            Those concerned about greenhouse gas emissions will be pleased to learn that the Norwegian fund has been pressuring American electric utilities not to build coal plants.  Other investors and investor groups are doing the same.  If enough investors do this, I'm confident that electric utilities won't be building many coal plants, if any, irrespective of Trump Administration policy.  If your investors and customers pressure you to dump coal, doesn't matter much what the President thinks.

            Are there many investor groups that could put pressure on electric utilities to avoid coal?  I think there may be more than anyone realizes. 

#4: Encourage Foreign Investors, Particularly Canadians, to Keep Buying US Utilities

            Another unexpected approach is to encourage foreign companies, particularly Canadian ones, to purchase American electric utilities.  There are a number of reasons why it's attractive for Canadian companies to buy American electric utility companies.  In 2016 alone, three were purchased: Fortis bought ITC, Algonquin Power bought Empire District Electric, and Emera bought TECO.  None of these were particularly large transactions, but those concerned about greenhouse gas emissions ought to be cheering each one.  The reason is that Canadian companies appear concerned about greenhouse gases and don't want to invest in technologies such as coal.  Once again, this works counter to the Trump policy of encouraging coal consumption.

#5: Get Utility Ratemakers to Provide Higher Rates of Return on Alternatives

            The fifth strategy has the greatest potential to reduce greenhouse gas emissions from electric utilities.  Let me explain the basis for this.  Electric utilities are considered monopolies, and the price of electric service is normally set by state public utility ratemaking commissions.  Such ratemaking is done by the states, as opposed to the Federal government.  The rate that a utility may charge its customers is, generally speaking, governed by the following formula:

            R = O + (V – D) × r, and the price charged per kilowatt hour is R/kwh

where R = the electric utility's revenue requirement

            kwh = total kilowatt hours sold

            O = operating costs of the utility

            V = amount of invested capital in the utility

            D = depreciation on the invested capital

            r  = allowable rate of return

Ratemaking for commercial and industrial customers is a little more complicated, particularly because of "demand" charges, but this simple formula should convey the core of the process.
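The formula can be expressed directly in code.  This is just a sketch of the textbook calculation with invented figures; real tariff filings involve far more detail:

```python
def revenue_requirement(o, v, d, r):
    """Classic utility ratemaking: R = O + (V - D) * r."""
    return o + (v - d) * r

def price_per_kwh(o, v, d, r, kwh):
    """Spread the revenue requirement over every kilowatt hour sold."""
    return revenue_requirement(o, v, d, r) / kwh

# Invented figures: $500M operating costs, $2.5B invested capital,
# $500M accumulated depreciation, a 10% allowed return, 8 billion kWh sold.
R = revenue_requirement(500e6, 2.5e9, 0.5e9, 0.10)      # $700 million
price = price_per_kwh(500e6, 2.5e9, 0.5e9, 0.10, 8e9)   # $0.0875 per kWh
```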

            The allowable rate of return is determined by the public utility commission of each state.  My idea is to encourage these commissions to provide differential rates of return based upon the type of investment.  The idea is to offer the utility higher rates of return on more desirable forms of investment (e.g., wind and solar) and lower rates of  return on less desirable forms of investment (e.g., coal).  If electric utilities can earn higher rates of return on wind and solar and lower rates of return on coal, more than likely they will invest more in renewables and less in coal plants.  Again, these decisions are made at the state level, not Federal.

          Besides the fact that certain forms of energy are more desirable than others, is there any economic justification for this?  I believe the economic rationale for differential rates of return is the hidden cost of greenhouse gases.  Various estimates have been made of that cost: some have calculated it as $ 37/ton, whereas others say it could be as much as $ 220/ton.  Let's assume, for the sake of argument, it's $ 100/ton.  Let's further assume that an electric utility has the choice of building either a renewable plant or a coal plant.  The two plants are projected to produce the same amount of electricity, but the coal plant will generate 10,000 tons of greenhouse gases/year.  Based upon the imputed cost of the greenhouse gases, that represents $ 1,000,000 per year in costs related to the greenhouse gases.  The state could encourage the company to build the renewable plant and split the $ 1.0 million in foregone greenhouse gas costs, giving the utility $ 500,000.  If the cost of the plant is $ 100 million, the commission could increase the allowed rate of return by 0.5% (i.e., $ 500,000/$ 100 million).  The company and its shareholders would make the identical capital investment but would earn that much more each year.  Likewise, the general public would benefit from the reduction in greenhouse gases. 
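The arithmetic in that example is easy to verify.  Here is a short sketch using the same assumed figures ($100/ton for greenhouse gases, 10,000 tons/year avoided, a $100 million plant, and a 50/50 split of the avoided cost between the utility and the public):

```python
def bonus_return(cost_per_ton, tons_avoided_per_year, plant_cost, utility_share=0.5):
    """Extra allowed rate of return funded by the utility's share
    of the avoided (hidden) cost of greenhouse gases."""
    avoided_cost = cost_per_ton * tons_avoided_per_year   # $1,000,000/year
    utility_bonus = avoided_cost * utility_share          # $500,000/year
    return utility_bonus / plant_cost                     # fraction of plant cost

extra = bonus_return(100, 10_000, 100e6)   # 0.005, i.e. an extra 0.5% per year
```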

Thus, through the ratemaking process, the state could provide incentives to the company to invest in cleaner technologies by adjusting the allowable rate of return on the investment.   Of course, some would object, saying that electric utility rates would go up.  That's true, but I would argue that this simply means consumers are paying the real cost of electricity generation.  Up to now, they haven't paid the true cost because the costs of greenhouse gases have been ignored.  Carbon taxes may be a political "no no", but differential rates of return could serve as an acceptable proxy for a carbon tax.

None of these ideas ought to be considered a panacea.   Instead, the point is to get everyone who is concerned about greenhouse gases to stop whining about what the Trump Administration has, or hasn't, done about emissions.  In the long term – meaning every four years – the public can express its opinion about what the Administration is doing.  In between, those concerned about things like greenhouse gases can, and should, use each problematic governmental decision as an opportunity to seek out an alternative.            

            Instead of whining about what "should have been", or "could have been", try to reframe the problem and consider it an opportunity to seek an unexpected perspective, and an unexpected answer.


To help reduce greenhouse gases, hybrid electric aircraft are being developed; and they may have some unexpected benefits.

            It seems as though it was just yesterday that the idea of electric powered vehicles was a pipe dream, yet today both hybrid electric and fully electric vehicles whiz down ordinary streets in every town.  The same may soon be true for hybrid electric airplanes.  Airplanes? 

            Fully electric airplanes are already a reality, though not many people have ever seen one, much less flown in one.  In the summer of 2016 a plane called the e-Genius set seven new records as it flew over the Alps.  It was built by a team from the University of Stuttgart in Germany and flew non-stop for 300 miles at a speed up to 142 miles per hour.  That doesn't sound particularly impressive, until you also find out that it climbed to 20,000 feet in under two minutes!  That's notable for most any plane, but this one was all-electric!

            As with automobiles and trucks, battery technology has improved dramatically in the past few years, creating the possibility of hybrid and all electric vehicles.  However, it's one thing to use electric power to propel an auto or a truck, and something quite different to power an airplane.  This is because while the critical limiting factor for batteries in an auto or truck is cost, the limiting factor for an airplane is weight. 

            The challenge is to improve what's called the energy density of the battery.  Energy density is the amount of energy stored per unit of battery weight or volume; for aircraft, energy per pound is what matters.  An important question to consider is, how many miles can the plane fly per pound of fuel or per pound of battery?  Today, it's estimated that 1000 pounds of jet fuel can take an airplane 14 times as far as 1000 pounds of battery.  The wings and fuselage must lift and propel the same 1000 pounds, be it fuel or battery, so the critical question is, which one can move the plane farther?  Right now, it's a slam dunk for jet fuel.

            Batteries, of course, keep improving.  Reports of the annual improvement vary, but a commonly stated number is 2 to 3 %.  Some quick math says that aircraft won't be able to have viable electric power propulsion systems for another 30 years.
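For what it's worth, the compounding arithmetic can be sketched as follows.  The answer is extremely sensitive to two assumptions: the annual improvement rate, and how much of the 14-to-1 gap actually needs to be closed (electric motors convert stored energy to thrust more efficiently than turbines, so full parity may not be required).  The rates and fractions below are illustrative, not predictions:

```python
import math

def years_to_close(gap_ratio, annual_improvement):
    """Years of compound improvement needed for battery energy
    density to grow by gap_ratio at a fixed annual rate."""
    return math.ceil(math.log(gap_ratio) / math.log(1 + annual_improvement))

# Closing the full 14x gap at 3% per year takes about 90 years...
full_gap = years_to_close(14, 0.03)
# ...but if only a quarter of jet fuel's energy density were needed,
# and improvement ran at 4% per year, it would be roughly 30 years.
partial_gap = years_to_close(14 / 4, 0.04)
```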

            Despite that math, several companies say they'll have commercial hybrid electric aircraft available in just about 5 years.  If the energy efficiency math is correct, how is that possible?  The answer has to do with an entire "rethink" of the air transportation system.  You see, the expected appearance of hybrid electric aircraft in the next decade could change much more than just the propulsion system on the typical aircraft; and it just might usher in the latest example of what Harvard Business School Professor Clayton Christensen calls "disruptive innovation".  Let's consider how this could happen.

            The Boeing 737, the workhorse short and mid-range aircraft of the past 40 to 50 years, probably won't have hybrid electric propulsion for another 30 years, maybe longer.  However, if some startup aircraft manufacturers have their way, you'll get your first ride on a hybrid plane in the early 2020's.  No, it won't be a 140 – 190 seat Boeing 737, or a Boeing 777, Airbus 319, or Airbus 380; it will likely be a 12 passenger plane.  The manufacturer?  A startup called Zunum Aero, located outside of Seattle. 

            Zunum recently released the following specifications for the plane they hope to begin flight testing as early as 2019:

  • 12 passengers (compared with 130 – 200+ passengers for the 737)
  • Take off distance of 2,200 feet (6,000 – 7,500 feet for the 737)
  • Flight range of 700 miles (3,500 to 3,800 miles for the 737)
  • Cruise speed of 340 mph (520 for the 737)

In terms of straight up comparison, we're talking two completely different birds.  The Zunum hybrid plane sounds like a toy compared to the workhorse 737, so why would anyone be impressed?

            Zunum, and possible "cousins" being built by companies such as Wright Electric, could be highly disruptive because they create the potential for an entirely new aviation market.  According to Clayton Christensen, the Harvard expert, disruptive innovation tends to occur at the bottom end of the market.  The products can't compete with the incumbents because they're too small and have too limited a feature set.  The main market, and the marketplace leaders, tend to ignore these innovators at the bottom end of the market.  Eventually, however, the new entrants at the bottom end of the market become real competition for the main market.

            So how could hybrid electric aircraft be disruptive?  It's the classic one word answer for disruptive products: cost.  Zunum projects that its 12 passenger hybrid will be able to operate at a cost of $ 260/hour.  For anyone associated with commercial aviation, that's an astounding number.  The commercial sector tends to think in terms of ASM's, an abbreviation for available seat miles.  That's the number of seats on the plane times the number of miles the plane flies, and those in the industry use cost per ASM as a key metric.  Zunum projects that its cost per ASM will be eight cents!  That's about one tenth the cost of a typical business jet today, meaning that Zunum could reduce the cost of a certain segment of aviation by an order of magnitude.
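As a sanity check, the cost per ASM follows directly from the numbers above.  Note that the straight arithmetic, counting all 12 seats at the 340 mph cruise speed, comes out somewhat under the quoted eight cents; the published figure presumably allows for load factor and slower phases of flight:

```python
def cost_per_asm(hourly_cost, seats, cruise_mph):
    """Hourly operating cost divided by available seat miles flown per hour."""
    return hourly_cost / (seats * cruise_mph)

zunum = cost_per_asm(260, 12, 340)   # about $0.064 per available seat mile
```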

            So just how might a company disrupt commercial aviation with the hybrid electric engine?  By creating a practical alternative to the "hub and spoke" system that major airlines use.  Most people who fly commercially are familiar with hub and spoke.  Imagine that you're like me, a regular customer of United Airlines.  I live in the Tampa Bay, Florida area and fly to lots of places.  When I get on a United Airlines flight in Tampa, invariably I will fly either to Houston, Washington, Newark, Chicago, Denver, or San Francisco.  Most of the time my final destination isn't one of those six cities, but I won't get to my final destination without connecting through one of those hubs. 

            Pretty much every major airline uses a "hub and spoke" system, so most every airline also flies its passengers through hubs.  They're highly efficient, and permit the average passenger to fly to a large number of destinations at comparatively low cost.  What's not to like?  Plenty!  The big problem with "hub and spoke" is that it makes the trip just that much longer, and increases the potential for delays, lost luggage, and every imaginable form of aggravation.  Instead of one unpleasant plane ride, you get two or three!

            So hybrid electric aircraft, with a dramatically different cost structure, could create lots of new possibilities.  One can see right away several great potential benefits:

  • Commercial flights from lots of additional airports
  • More direct flights rather than connections through a hub, meaning much shorter elapsed time from origin to destination
  • The potential to simplify the process of getting on and off a plane
  • Much lower cost.

Consider that today, only about 2% of airports have commercial flights.   The fact that these new hybrid planes can take off on a 2,200 foot runway means far more airports could have commercial flights.  Use of smaller aircraft, with a much lower breakeven cost, means the possibility of far more "point to point" flights.

            The idea of replacing "hub and spoke" isn't new.  Various entrants to commercial aviation have been trying to do this for years.  One very promising entry was DayJet, a Florida based airline startup in the early 2000's.  Unfortunately, it didn't work: DayJet took off and very soon landed in Chapter 7 bankruptcy liquidation.  But DayJet couldn't benefit from the expected economics of these new hybrids. 

            Of course, the new planes are still under development, but here are some possibilities to consider:

  • A trip from San Jose, California to Los Angeles presently takes about 4 hours and 40 minutes when flying, and costs about $ 160. 
  • Zunum expects it can reduce that trip to 2 hours and 15 minutes at a cost of $ 120, a quarter less.  It isn't that Zunum's plane will be flying faster (it won't), it's that smaller airports can be utilized. 

Rather than fly through big airports like LAX in Los Angeles, why not go out to a small local airport, park your car, then just get on the plane, maybe even without going through TSA?  Sign me up!  That's always been the great appeal of private aviation, except that you had to have at least $ 20 million in your bank account to participate.  Smaller, slower planes such as the one Zunum is promising could provide the tortoise to commercial aviation's hare, to borrow from Aesop, and make this available to everyone else.

            Hybrid aviation should offer one other important benefit not yet mentioned.  In fact, this other benefit has been the real driver of the industry: lower carbon emissions.  Aviation is a major contributor to greenhouse gases worldwide, and it's expected to get much worse over the next 30 years with continued aviation expansion.  Hybrid technology, then all electric, could have a major impact on aviation-caused greenhouse gas emissions.

            Please remember, these aircraft are still under development so don't plan on booking a ticket any time soon, unless you happen to be a test pilot.  But they could have a dramatic impact on aviation, not simply because they should produce significant reductions in carbon emissions, but mainly, and unexpectedly, because their dramatically different economics could really change flying.  They could, in the parlance of Clayton Christensen, be "disruptive innovators".   Not quite, but soon, ready for take off.




A look at a new book that prescribes a better way to take care of our minds, and the implications for all of us

            The USA faces numerous challenges, two of the very biggest being problems with our educational system and with healthcare.  The Organization for Economic Cooperation and Development (OECD) reports that the USA spends more than any other country on primary and secondary education (K-12) per student; and various studies show we spend far more than any other country on healthcare per person.   We somehow aren't spending our money very wisely, as our students perform comparatively poorly on standardized tests that are administered around the world; and despite widely available, fantastic healthcare technology – partly the result of spending more than twice as much per capita as any other country on health care - we rank 43rd in the world in life expectancy, even behind countries such as Cuba.

            Unfortunately, the stock answers for these problems are more spending and more programs: additional requirements for students in K-12, and more spending on healthcare, especially to pick up people who are somehow being left behind.  Given that the standard strategies only seem to be making things worse, not better, is it time to try something really different?  A book I recently read may help point us to a common solution for both of these problems – and so far as I know, very few people have considered its adoption as the key solution.  It proposes a better way for each of us to care for the health of our minds, and all of the attendant benefits of doing that.  It's not a panacea, but it could have a marked impact on both.  I'll explain that further below.

            The solution is aerobic exercise, coupled with other types of exercise.  The evidence underlying this idea is set forth in a new book called Spark: The Revolutionary New Science of Exercise and the Brain, written by Dr. John Ratey, a professor of psychiatry at Harvard Medical School.  Ratey is well known for his research in attention deficit disorder.  One of his previous books is Driven to Distraction, which addresses the problem of attention deficit disorder, including in adults.  Ratey's goal in Spark is to demonstrate the important link between exercise and the brain.   While anecdotal evidence has suggested a link between the two, up to now there hasn't been a great deal of scientific evidence.  Ratey reviews a great deal of scientific evidence produced in the past few years to build his case.  As he notes early in the book, it's already well known that inactivity is killing our bodies, but he demonstrates that it's also killing our brains.

            So what evidence does Ratey provide?  The first has to do with the importance of aerobic exercise to help improve learning.  He cites the examples of schools in Naperville, IL and Titusville, PA to demonstrate how changing a school's emphasis towards aerobic physical training can have a remarkably positive impact on the performance of children in schools.  While the evidence for Naperville is certainly very positive, one might tend to dismiss it because it's in a fairly privileged community.  That's why the evidence from Titusville, PA is so instructive.  Titusville is a depressed community, but it achieved similar outcomes to those in Naperville.  The clear message is that instead of programs like "No Child Left Behind", we probably should have programs such as "No Child Left Aerobically Unfit".

            Ratey also points out the benefits of aerobic exercise for adults.  The benefits are not simply improved health and a trimmer waistline, as everyone already knows.   Ratey cites evidence that aerobic exercise is beneficial to adult learning.  One of the simple but great takeaways from the book is that one should do aerobic exercise before tackling any important mental tasks.  The aerobic exercise helps prime the brain to be at its best.

            Aerobic exercise is certainly known to benefit cardiovascular health, but what role does it play in the brain?  Fundamentally, according to Ratey, it's an issue of balancing neurotransmitters in the brain.  He provides fairly detailed explanations of the processes, but in layman's language, so the material is approachable by readers who aren't medical doctors or neuroscientists. 

            Beyond education, however, Ratey demonstrates the importance of aerobic exercise to impact a range of health issues facing the country.  He builds the case that aerobic exercise can play a very important role in helping mitigate and treat:

  • Depression and mood disorders
  • Attention deficit disorder
  • Addiction to drugs, alcohol and smoking
  • Hormonal changes in women
  • Dementia
  • Aging.

            He isn't saying aerobic exercise is a panacea for dealing with each of these health issues, but that it can play a very significant role in mitigating them.  As an example, he points to evidence that a consistent exercise program can be just as beneficial as drugs like Zoloft in fighting depression.  According to Ratey, there's long been a good deal of anecdotal evidence, but now there is scientific evidence to back up what's been informally observed.

            My conclusion is that the application of his ideas could help create major improvements in the twin problems of education and healthcare in the USA.   Moreover, they could provide better outcomes for substantially lower cost.  Consider education first.  In the case of Naperville, IL and Titusville, PA, neither school district spent a lot of money to buy expensive equipment or build fancy facilities.  Probably the major expenditure was to purchase heart rate monitors, the thin straps fitted around the chest that many athletes use.  It didn't involve hiring a lot of additional teachers, education specialists or administrators, either. 

            Yet the results were pretty dramatic.  In both cases, student performance on standardized tests improved fairly significantly.  Not only that, but teachers also no longer had to deal with as many behavioral problems with students.  It wasn't that the educational curriculum changed.  It wasn't the result of a fancy new approach to teaching, new textbooks, new computers, or other new systems.  It was simply getting kids to do various forms of aerobic exercise.  For many, it was running around a track, but that wasn't the only choice available to students. 

            With respect to health care, Ratey's proposed solution also doesn't involve a lot of expenditure.  If anything, it involves less expenditure.  Aerobic and related exercise replaces medication, in whole or in part.  For example, instead of medicating children and adults with ADHD, exercise is substituted, with exercise producing results as good as the medication, maybe even better.   In other cases, the aerobic exercise regimen provides a way to avoid other costs, for example:

  • The cost of treating addiction in the standard ways
  • The cost of treating dementia in older adults, because exercise can help stave off the disease for a longer period of time
  • The cost of treating the various diseases associated with aging, because exercise provides a way to help keep older adults healthy for a longer period of time.  Ratey isn't saying that exercise will prevent these diseases, but it can help delay the onset of symptoms, as well as reduce severity.

            What Ratey is suggesting is pretty simple, and should be pretty obvious, but unfortunately, it probably won't happen, at least not on any large scale.  That's because there are lots of institutional and other forces arrayed against this outcome.  No, it isn't because of some grand conspiracy, or even a series of mini-conspiracies, it's just that for Ratey's approach to be adopted, lots of institutional inertia will need to be overcome.

            For example, Ratey's strategy will result in far fewer pharmaceutical products being consumed, at least when it comes to trying to solve problems with the mind.  For lots of people, that would probably be a great thing, but the pharmaceutical industry certainly won't be amused by this.  After all, pharmaceutical companies are in the business of selling drugs, not sneakers: Roche is not Reebok, Abbott is not Adidas, and Novartis is not Nike.  Moreover, medical doctors are much more in the habit of prescribing pills, not exercise regimens; and even if they were, they still need to persuade their patients.  Which may be the ultimate problem: the average American finds it easier to pop a pill than to exercise.  Until that somehow changes, Ratey's strategy will stay in the realm of "that's a great idea, but not many people will do it".

            It may be different, however, for the education problem.  What Naperville and Titusville have done was pretty simple, and pretty inexpensive.  One could make an argument that teachers and administrators might oppose it, but once they see the outcomes, they're likely to want to join the bandwagon.

          Ratey would like to start a bandwagon, but one hasn't formed, at least not yet.  Why?  Most likely, because the Naperville and Titusville programs aren't particularly well known, at least outside a fairly small circle.  And that, I think, may be because what Ratey is proposing is counter-intuitive – hugely counter-intuitive.  But it seems to work, not for magical reasons but for ones based upon the emerging science of exercise and the brain. 

            One of the surprises of Naperville and Titusville is that it wasn't that hard to get kids to participate … and that may lead to an overall strategy for both education and healthcare: get kids to lead the change.  Here's how that might work:

  • Getting kids into better shape, with an associated improvement in educational performance, as well as health
  • Teaching kids the relationship between exercise and brain function so that they grow up understanding it
  • Getting kids to influence their parents.

We're accustomed to having older generations teaching the younger generation.  In this case, the reverse might come into play, with the younger generation leading the way. 

            If that happens, then it would eventually impact overall health and healthcare spending.  It would just take a generation or so.  If a generation of kids grows up understanding the relationship between exercise and brain health, then eventually there may be changes in healthcare as well.  No guarantee, but there's a real chance.  Until that happens, though, nothing is likely to change.  That's because adults already know they should exercise … and some do … but a huge percentage of the adult population doesn't.  It's still easier to reach for a pill made by Roche, not a pair of sneakers made by Reebok.  Not only that, but it's still easier for doctors to prescribe pills, even where the better prescription may be aerobic exercise.  

            On the other hand, we're talking about our brains.  What Ratey is proposing is a fundamentally different way to think about how to take care of our brains, and the positive impacts that could have in so many ways.  If people realize the impact of exercise on mental health, and everything connected with it, maybe they'll get more serious about exercise.

            Even if Ratey's strategy isn't widely adopted, it could still provide lots of benefits to individual schools and individuals dealing with health care.  It's a strategy that can benefit not only the well to do, but also poor people.

            Spark is a most interesting book, and thought provoking as well.  I encourage you to pick it up.   


Drones are coming to the rescue in hurricanes, earthquakes and forest fires. They're coming to an even bigger rescue.

            Disasters sometimes have a bright side.  Given all of the recent news about hurricanes, earthquakes, and forest fires, a little bit of good news would certainly be welcome. 

Out of all of this destruction, what could possibly be good news?  Actually, it has to do with unmanned aerial vehicles (UAV's), aka "drones".  The good news is they're helping with hurricane and earthquake relief.  The better news is that they're pointing to something else: an even bigger and more important rescue.

            Drones seem to be showing up in all kinds of places.  Hobbyists are the biggest buyers of drones, as approximately 94% of drone sales are in the hobby market.  This is largely due to the tremendous reduction in the price of drones.  You, yourself, may have given or received a drone as a present last Christmas.  The whirring sound of hobbyist drones is increasingly common in parks and neighborhoods around the country. 

            All of these drones have created a range of different problems.  Many, including the Federal Aviation Administration, are concerned by the proliferation, and there's a race to get up to date regulations in place.  There's also been a battle over whether states and localities can set different, stricter rules regulating drones.

            While hobbyist drones are proliferating, so are commercial and industrial drones, just on a different scale.  Only about 6% of drone sales are commercial and industrial, yet that 6% actually represents 60% of the dollar value of drone sales.  That's because the commercial and industrial UAV's are far more sophisticated, and far more expensive, than their hobbyist cousins found in the typical family garage.
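
A quick bit of arithmetic (mine, not from any of the market researchers) shows just how much more expensive those commercial units must be.  If 6% of units account for 60% of dollars, and the remaining 94% of units account for the other 40%, then the average commercial drone sells for roughly 23 times the price of the average hobbyist drone:

```python
# Implied average-price ratio from the unit-share and dollar-share
# figures cited above (6% of units = 60% of dollar value).
commercial_unit_share = 0.06
commercial_dollar_share = 0.60
hobby_unit_share = 1 - commercial_unit_share      # 0.94
hobby_dollar_share = 1 - commercial_dollar_share  # 0.40

# Average price per unit is proportional to (dollar share / unit share).
price_ratio = (commercial_dollar_share / commercial_unit_share) / \
              (hobby_dollar_share / hobby_unit_share)
print(round(price_ratio, 1))  # → 23.5
```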

            Usage of commercial drones has definitely been growing, but the recent spate of hurricanes, earthquakes, and fires has revealed a potential huge area for these drones – disaster relief.  Just since Irma, the FAA has issued 132 authorizations to use drones in hurricane relief in Florida alone.  Let's consider how they've been used in the wake of Hurricanes Harvey, Irma and Maria, as well as recent earthquakes and fires.

            After Hurricane Irma, the Air National Guard started using drones to conduct aerial surveys, deciding where relief is needed most.  Historically, the task of surveying damage has been painstakingly slow, done on a block by block basis.  Drones can have a hugely positive impact on this.  The Federal Aviation Administration has been quite wary of drones because of air safety concerns.  Yet FAA spokespersons have recently remarked that drones are playing "an invaluable role" in Hurricane Irma relief.  FAA Administrator Michael Huerta recently said, "I don't think it's an exaggeration to say that the hurricane response will be looked back upon as a landmark in the evolution of drone usage in this country."

            In the wake of Irma, the US government's Customs and Border Patrol agency has used drones to map Key West, Miami, and Jacksonville, each significantly affected by the storm.  This mapping process is obviously much faster than the traditional method.

            The Red Cross has been using drones to facilitate relief efforts in the wake of Hurricane Harvey in Texas.

            Besides relief agencies, insurance companies are also beginning to use drones.  For example, Airbus Aerial, a division of European jet maker Airbus, is deploying drones on behalf of insurance companies.

            A Canadian company called The Sky Guys has been using some of its drones to help with mapping after Hurricane Harvey.  Sky Guys typically deploys drones for infrastructure construction work, so putting some of their drones to work in the Harvey cleanup is a bit of a departure, but one that seems to be working well.

            Another application of drones is to provide emergency mobile phone and Internet service.  Many victims of Hurricanes Harvey, Irma and Maria have contended with knocked out telecommunications.  These days, that's one of the biggest problems.  One of the solutions is to launch drones with 4G LTE service.  Drones are launched and "tethered in place".  It's a bit unorthodox, but who cares if your mobile and Internet service are out and a drone restores it?

            How rapidly is the market for drones growing?  Based upon research by Goldman Sachs, Gartner, and PriceWaterhouseCoopers, very rapidly indeed, and not just for hobbyists!  Gartner says the market grew 60% just from last year to this year.  Goldman Sachs recently issued a report saying that by 2020, the market will be $ 100 billion.  The consumer market will grow to $ 17 billion, but that's dwarfed by the military market ($ 70 billion) and what Goldman calls the "Commercial/Civil" market ($ 13 billion).  PwC goes farther, estimating the worldwide market in 2020 will be $ 127 billion.  These are, of course, merely estimates, but they strongly suggest a very fast growing market.  Given the recent application of drone technology to disaster relief, the estimates may even be understated.
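
As a simple sanity check of the Goldman figures cited above, the three segments do add up to the $ 100 billion total:

```python
# Goldman Sachs 2020 drone-market estimate, by segment, in billions
# of USD (figures as cited in the text).
segments = {"military": 70, "consumer": 17, "commercial/civil": 13}
total = sum(segments.values())
print(total)  # → 100
```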

            Besides the interesting fact of these new, innovative uses of drones in disaster relief, I bring this to your attention because it is another example of how technology can once again come to the rescue (no pun intended).  Drone technology is certainly helping with disaster relief in new and innovative ways.  The other reason for pointing this out is as a counter to all of the recent talk about how robots will be bringing forth Armageddon.

            It's hard to find a day when there isn't another news story proclaiming that robots will leave huge numbers of people unemployed.  Robots, like drones, are clearly proliferating.  Unfortunately, the National Bureau of Economic Research, a private nonprofit research organization, recently reported that 6.2 jobs are eliminated with each new robot.  PriceWaterhouseCoopers projected in another recent report that 38% of current US jobs could disappear within 15 years, a higher percentage than what's projected for other major economies.

            On the surface, it sounds pretty bleak for lots of American workers, but is it really Armageddon?  I don't think so, precisely based upon the example of drones in disaster relief.  We keep expecting that jobs will disappear – and they do – but we fail to appreciate that new technology also creates new products and services, applicable in new markets.  Ten years ago, if someone told you that drones would be used in disaster relief, you might have laughed, or at least you would have been skeptical, but you probably aren't now.

            Besides hobbyist applications, how many different ways can drones be used?   Goldman Sachs identified the following as significant opportunities for growth:

            Construction               Agriculture                  Insurance claims

            Offshore oil/gas          Police                          Fire

            Coast Guard               Journalism                   Customs/Border Patrol

            Real estate                 Utilities                        Pipelines

            Mining                        Clean energy               Cinematography


No doubt, if you spend some time thinking about it, you can come up with all sorts of other applications. 

            Growth in the UAV/drone industry will doubtless result in large numbers of new jobs.  Many will be related to the design, manufacture, sales, distribution and servicing of the drones themselves, but the vast majority will probably be in the industries that put the drones to new, imaginative uses.  Think of all of the people who will be employed in the 15 categories identified by Goldman Sachs. 

            But what about the robots?  Unquestionably, lots of jobs are going to disappear because of robots, but we are not on the verge of Armageddon.  Here's why.  Thomas Malthus was probably the first to predict that economic and population growth would lead to disaster.  Malthus lived in the late 18th and early 19th centuries in England.  He was a keen student of population and made some important observations.  For example, he said, "Population, when unchecked, goes on doubling itself every 25 years or increases in a geometrical ratio."  He was fairly accurate on that.  But from that he concluded, "The superior power of population cannot be checked without producing misery or vice." 
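
Malthus's observation is simply geometric growth: doubling every 25 years means multiplying the population by 2 for each 25-year period that elapses.  A small sketch, with illustrative numbers only:

```python
# Malthusian (geometric) growth: P(t) = P0 * 2**(t / doubling_period)
def malthusian_population(p0, years, doubling_period=25):
    return p0 * 2 ** (years / doubling_period)

# Starting from 1 million people, a century of unchecked doubling
# yields four doublings, i.e. a sixteen-fold increase:
print(malthusian_population(1_000_000, 100))  # → 16000000.0
```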

            His predictions haven't come to pass.  Likewise, the doomsday predictions of the intellectual descendants of Malthus – Paul Ehrlich, the Club of Rome, and others - have not come to pass.  The reason is because Malthus and others didn't factor in technological changes and improvements.

            It isn't that technology is a panacea, or some form of salvation, it's just that its application repeatedly results in new and unforeseen improvements that avoid or overcome the disaster scenarios that Malthusians have predicted.  One byproduct of this is that jobs are created in surprising and unexpected places.  The application of drones in disaster relief is but the most recent application of a seemingly timeless concept.  The same with all of the other new applications for drones.

            So robots, themselves a new technology, will probably keep proliferating and destroying jobs.  But other new technologies, such as drones, will create new applications, new industries, new products, and new jobs.

            Drones are coming to the rescue in the aftermath of Hurricanes Harvey, Irma, and Maria, as well as the Mexican earthquakes.  Great news!  They're also coming to an even bigger rescue: creating some of the new jobs that will replace those lost to robots. 

            Instead of bemoaning the loss of jobs to robots, I think we should concentrate on asking: what are the most promising new technologies available; how can we encourage the deployment of those technologies to solve problems (e.g., disaster relief); and how can job growth be fostered?  Besides all of the direct benefits of these technologies, we'll end up with the side benefit of offsetting job losses due to robots.  At the same time, just as we focus attention on helping hurricane and earthquake victims recover, we should focus attention on helping the victims of robot-driven job losses get retrained so they can take on the jobs of the future.

            Yes, drones in disaster relief: coming to the rescue in more ways than one.

            Please feel free to share your comments.       


A review of an interesting new book about 25 scientists and theologians who changed their minds and embraced evolution, all the while maintaining their Christian faith.

I've never had to change my mind about evolution, as I can't ever remember a time I doubted it, but lots of people can't say that.  I've just completed a most interesting book titled How I Changed My Mind About Evolution.  It includes brief personal stories from some 25 scientists and theologians. These people are all committed Christians who previously were skeptical about evolution, but who have now come to embrace it.  In the process of embracing evolution, they continued to affirm their Christian beliefs.

The book is edited by James Stump and Kathryn Applegate, both of whom are connected with Biologos, an organization that endorses the idea that the findings of modern science and the Christian Bible are completely compatible.  Biologos, and those who hold similar views, often describe this position as evolutionary creationism.  This is not the creationism that says the Earth is no more than about 6,000 years old, that humans were specially created by God rather than through the process of evolution by natural selection, and that the Biblical Book of Genesis is literally true.  Instead, it fully embraces Darwinian evolution by natural selection.  In that sense, it is identical to the beliefs of people like Richard Dawkins, except that supporters of evolutionary creationism believe that the world was created by God in the Big Bang, and that the process of evolution is, and always has been, ultimately under the control of God.

Many of these stories are intensely personal, and quite a number of the contributors share very sad tales.  A number are university academics, possessing doctorates in a broad range of fields, and who pursue advanced research.  We like to think that universities are places where competing ideas are shared.  Unfortunately, that wasn't the case for some of the authors, who reported that their colleagues were often close-minded and couldn't understand how any of their colleagues could simultaneously believe in scientific concepts and maintain strong Christian beliefs.  The very same people then reported that fellow members of their churches were equally skeptical that they could be Christians and yet harbor beliefs in evolution by natural selection – a sort of reverse close-mindedness.  In other words, they were greeted with skepticism and mistrust both at work and at church!

Stories with personal drama are often attractive, but why should anyone be interested in the ones recounted in this book?  Having read the book, I think three distinct groups ought to take a look at it.  Surprisingly, the first group includes those who are not Christians.  Why would non-Christians, particularly those who strongly endorse Darwin's theory of evolution, care about a book like this? 

The reason is to help overcome a popular stereotype that Christianity and modern science are incompatible.  This is an idea that has been fostered on one hand by militant atheists such as Richard Dawkins, but also by many fundamentalist Christians who are skeptical of Darwin.  The stories in the book show this simply isn't the case: people with serious scientific credentials can simultaneously endorse ideas such as Darwin's theory and a universe that is 13.8 billion years old.  As an example, Francis Collins, best known as the head of the Human Genome Project, and also the founder of Biologos, has an essay in the book.  Collins is a highly respected scientist who fully embraces Darwin and is a committed Christian.  Another contributor has a PhD in Astronomy from MIT, while still another has a PhD in computational cell biology.  No intellectual slouches in the bunch!  Moreover, as noted by one of the contributors, "While [Richard Dawkins and other militant atheist writers] are persuasive, what many readers fail to see is that they misuse the authority of science (the study of the natural world) to claim that belief in the supernatural is irrational."  The notion that you must either believe in modern science or believe in religion is ultimately cartoonish and certainly overly simplistic.  So this book can provide atheists a different perspective, maybe even food for thought.

A second group that should find this book interesting is people who maintain Christian beliefs and also accept Darwin's theory of evolution by natural selection.  I include myself in that group.  For this group the book should be worthwhile if for no other reason than to read the stories of others who have struggled with the issue.  Doubtless, many who are now able to reconcile science and the Bible have faced their own struggles and will recognize stories similar to their own.  The 25 contributors to the book possess a range of views on this subject. 

The third group is the key one to whom the book is directed: Christians who are skeptical of Darwin's theory of evolution by natural selection.  A common experience for the authors was to grow up in a Christian home, exposed to a world that was anti-evolution.  Problems started to arise when these writers became exposed to modern science and found it challenging to reconcile modern science with their Christian beliefs.  Ultimately, each of the contributors found a way to reconcile beliefs, but also pointed out that many others simply couldn't and, as a consequence, lost their faith.

One of the contributors observed that this is a major problem facing churches today: young people grow up in the church, lacking exposure to modern science, then are thrust into the world of universities and popular culture and find themselves un-moored. That was the experience of many of the 25 contributors.   Unfortunately, a very high percentage of 18-to-30-year-olds who grew up in the church are leaving it, quite often because of this issue.   Many of the 25 stories are of people who actively fought against Darwin's theory, or who actually lost their faith when they discovered that scientific evidence of evolution ran counter to it.  In each case, however, they ultimately found a way to regain their Christian faith, as well as to embrace Darwinian science.

Several of the contributors noted that a core problem is that so many Christians grow up in a world that is seemingly walled off, and one where there is never a serious discussion about modern science and how it might relate to religion.  Increasingly, we all seem to live in our own "filter bubbles", the world of evangelical Christians merely being one.  Atheists appear to have their own "filter bubbles", too.

The writers lament this state of affairs and hope that it will change, but I personally don't expect that to happen.  It isn't because evangelical Christians are ill-educated or stupid, as some might like to think.  That simply isn't the case.  Instead, they resist modern scientific findings about evolution and the age of the Earth because they sincerely feel that modern science is at war with them.  Even a casual reading of Richard Dawkins, Daniel Dennett, and Sam Harris will lead one to conclude that evangelicals aren't just imagining that.

So if the real problem is that evangelical Christians are seemingly close-minded about modern science, how might that resistance be overcome?  Ultimately, I believe it comes down to a matter of providing a "welcoming environment."   At present, every Christian who rejects modern science as seemingly anti-Biblical has to reach two conclusions: 1) that they've been wrong about the science all this time; and 2) that there is a way to reconcile their Biblical beliefs with modern science.  For an awful lot of them, that's a very tall order.  They're being asked to do something that in their minds is very difficult, but they haven't been given a reason they should want to do it.  From their perspective, there is no benefit, at least no perceived benefit, to making the change.  Until Christians who reject modern science as seemingly anti-Biblical can be offered a more "welcoming environment", meaning reasons why they would benefit from changing their minds, they're likely to remain highly resistant.   Which then means more and more people, especially younger ones, will reach the proverbial "fork in the road": clinging to their faith or rejecting faith to accept modern science. 

So if that's the case, what might the "benefits" be to an evangelical skeptic of Darwin making a change?  Up to now, the only reason they've been given is, the science is good.  In their minds, that hasn't been compelling. 

I think there is a much better argument.  Instead of saying, believe in evolution because the science is absolutely compelling, I say, believe in it because evolution through natural selection can be used to reinforce two key and distinctive concepts in Christianity.  The first is that mankind is sinful, and that the proclivity for sin has been transmitted down through the generations to every human.  That's always been a core Christian belief, but there really hasn't been any hard evidence of it.  My argument is that evolution actually provides a very reasonable explanation for sin.

The second core idea is that humans cannot overcome this sinful behavior.  In other words, we cannot through our own efforts truly improve ourselves.   We have to depend upon God.  Again, physical evidence for this has been scant.  And again, I believe that modern science, through Darwin's theory, can be used to demonstrate the reality of this idea.

These two ideas form the core of what makes Christianity distinctive.  Therefore, the benefit for Christian skeptics of embracing evolution is real-world evidence that those doctrines are more than just articles of faith.  These ideas are discussed more fully in my book, The Unexpected Perspective.

How I Changed My Mind About Evolution is definitely worth your time, and I encourage you to read it, whatever your particular perspective on modern science and religion.


There's a lot of concern that climate change is making hurricanes more frequent and worse. That may be the case, but climate change isn't the place to focus attention if you're trying to reduce the impact of hurricanes.

            The incredible destruction wrought by Hurricane Harvey on Texas, as well as that from Hurricane Irma – a disaster that is still in progress as I write this – reminds us of the unbelievable havoc and misery that hurricanes and tropical storms can wreak.  The fury accompanying these two storms has raised an obvious and important question: is climate change making hurricanes worse; and isn't this an important reason to take action on climate change?

            I definitely believe in human-induced climate change, and I also strongly suspect that climate change may well be making hurricanes at least somewhat worse.  But if we want to try to reduce the tragic impact of hurricanes, focusing on climate change is at best a distraction in the effort.  Let me explain how I come to what is probably an unexpected conclusion.

            Before going any further, let's consider why climate change might be making hurricanes and tropical storms worse.  The two key reasons are water temperature and water vapor in the air.  Hurricanes gain their energy from warm ocean temperatures.  In fact, a hurricane can only form if the water temperature is at least approximately 80 degrees Fahrenheit (26.6 degrees Celsius).  It can only be sustained with warm water temperatures.  The warmer the temperature, the greater the chance of a hurricane forming and/or strengthening.  Global warming certainly appears to be increasing water temperatures.  At the same time, higher temperatures tend to increase the amount of water vapor in the air, something else that helps nurture a hurricane and make it more destructive.  So other things being equal, global warming may well be contributing to the problem both of the number and intensity of hurricanes and tropical storms.

            Yes, but it isn't so simple.  Let me explain why.

            First off, even if we could somehow end the problem of global warming and associated climate change, it's not clear what impact there would be on the number of hurricanes or their intensity.  We know this for no other reason than that there were intense hurricanes before there was evidence of global warming.  In fact, since the start of the 20th century, the USA has experienced a Category 4 or 5 hurricane about once every 25 – 30 years: one in 1900, one in 1935, one in 1961, one in 1969, one in 1992, and now one in 2017.  Lesser hurricanes are an even more frequent occurrence.  Table 1 below shows a list of the most intense Atlantic basin hurricanes over the past century.  Hurricanes such as the 1900 Galveston storm, the 1935 Florida Keys storm, Carla, and Camille were likely just as intense as Irma and Katrina, and all occurred before global warming was an issue.   So solving the global warming problem is certainly not going to eliminate these hurricanes.  It may reduce their frequency, but even that isn't clear.


Table 1: Past Category 4 and 5 Hurricanes (windspeed in miles per hour)

Storms listed include Mitch (did not hit USA), the Florida Keys hurricane, Gilbert (did not hit USA), the Galveston hurricane, and an unknown Category 5 storm.

            But the intensity of the hurricane really isn't the thing we should be worried about anyway.  Instead, deaths and injuries, as well as the resultant damage, are the real concern.  After all, there have actually been a number of extremely intense hurricanes in the Atlantic that never touched land.  Nobody remembers the names of those storms, and nobody really cares.

            So which storms have actually been the deadliest and costliest?  The deadliest by far was the 1900 Galveston hurricane, which killed an estimated 6,000 people.  They had virtually no warning on that one.  Fortunately, modern technology has helped to provide better warning, with much less loss of life.  The 1926 Miami hurricane killed 372 people, mainly because people didn't understand that the calm of the storm's eye is but a precursor to another round of the storm. 

            Then there's property damage.  Table 2 shows a list of the most costly hurricanes and tropical storms.  One interesting thing to note is that amongst the costliest were storms that weren't intense.  In fact, several of them – Tropical Storm Allison and Superstorm Sandy – weren't even hurricanes.  They did incredible damage, however, and besides fatalities and injuries, that's what really gets our attention.

Table 2: Costliest Hurricanes/Tropical Storms (estimated cost in billions of US dollars)

Storms listed include Tropical Storm Allison and Superstorm Sandy.

            Our real concern shouldn't be how intense the storm is; it should be how much loss of life and injury it causes, as well as how much damage.   To deal with those, there are three things we can focus on.  Let's consider each of them.

            The first is the technology associated with tracking storms and predicting where they'll go.  The 1900 Galveston hurricane killed so many people because there was little technology to track the storm and warn people to get out.  We can and should continue to improve this technology, but we're not likely to gain much more here.  Yes, we can build ever better weather satellites and sensors, but such improvements will probably have only marginal impact.

            Instead, we should give greater attention to the second area where we can improve – building technology and building codes.  The destruction caused by Hurricane Andrew in 1992 led to a detailed review of building codes and practices.  They were strengthened significantly, especially with respect to window and door technologies, as well as methods to ensure that roofs won't blow off.  Homes and businesses built since 1992 are far more likely to survive an intense hurricane, thanks to the Andrew-induced changes.  More can obviously be done in this area, particularly in retrofitting existing structures.

            While spending on hurricane tracking and building technologies can help save lives as well as reduce property losses, there is a third area that will yield substantially greater reductions in deaths, injuries and property damage … and it has absolutely nothing to do with global warming or technology.  Instead, it has to do with zoning and insurance.

            The biggest single danger in a hurricane or tropical storm is storm surge.  The combination of a hurricane's low barometric pressure and its strong onshore winds causes the ocean to rise, often by many feet.  The stronger the storm, the greater the surge.  How do you avoid this problem?  By either not building structures in low-lying areas adjacent to the ocean, or building structures high enough that the storm surge passes underneath them.

            This isn't some great new revelation – it's been known for at least fifty years.  The other thing that's been known for many years is what areas are susceptible to storm surge and flooding.  So you may ask, if we know that storm surge is a problem, and we also know where it could be a problem, why haven't we solved the problem?

            The answer, unfortunately, is that we don't want to acknowledge the problem.  Not only that, we take active measures through our government to make the problem worse.   Let me explain how, and why.

            We have pretty detailed maps that show what areas in the country will flood, as well as the estimated frequency of flooding.  This is quite well known for coastal areas, especially low lying coastal areas.  You may ask, if we know the relative frequency that these low lying coastal areas will flood, why do we build structures in those areas?

            It's a good question.  Some say we shouldn't build structures in low lying coastal areas for this very reason.  One way to solve the problem is through property insurance.  Unfortunately, about fifty years ago, property insurers concluded that flood insurance simply wasn't a good product to sell.  This is because the property insurers calculated they would have to pay out too many claims and wouldn't be able to make money.

            To the rescue came the US government, which decided to guarantee the flood insurance policies that insurance companies wrote. This helped foster the development of property in flood-prone areas, including areas subject to hurricane storm surge.  Lots of people were happy about this: property developers, because they could build beautiful beachfront developments, and buyers.  So what could go wrong?  Plenty.

            Remember that the reason the Federal government started guaranteeing flood insurance policies was that the private market wasn't working.  The government's involvement in flood insurance has had a whole host of unintended consequences.  The key one is that a huge amount of development has occurred in these flood-prone areas.  Every time a hurricane or tropical storm hits, huge claims have to be paid.  The real reason the costs in Table 2 are so high is that these storms did serious damage to structures principally located in flood plains.   The 1926 Miami hurricane, a pre-global warming storm, killed lots of people and did a lot of damage.  If the same storm occurred today, it's estimated it would cause $164 billion in damages.  This is because there is now so much more development, much of it in flood-prone areas.

            Unfortunately, the problem just gets worse, because we keep permitting development in known flood plains; and that development is backstopped by the Federal government.

            We probably can't do much about reducing the number of hurricanes and tropical storms we have, at least in the short run, but we can do something about building structures – especially expensive structures – in known flood plains.  If we curtailed the number of structures in flood plains, we would likely reduce storm-induced damage there.

            We could materially reduce the terrible cost of hurricanes by focusing on items two and three (i.e., improving building codes and zoning, and reducing the amount of construction in flood plains).

            Here's the really good news about this.  It can all be done without the Paris Climate Accord … without developing any new technology to reduce carbon emissions … and without worrying about who is the President of the United States.  Much of it can be done without even spending money. 

            If it is indeed that easy, why hasn't it been done?   Quite simply, because there are lots of incentives to build structures in known hurricane flood plains, but not enough disincentives to prevent this from happening.  The incentives are obvious: buildings near the sea are highly desirable, and economic development of the beach is highly attractive for lots of people.  The disincentives are far less obvious.  The big disincentive – paying out Superstorm Sandy-sized insurance claims – just isn't felt until it happens.

            What realistically can be done?  At one extreme, we could stop all development in flood prone areas.  Pretty draconian, but that would reduce the problem going forward.  At the other extreme, we could end all Federal flood insurance guarantees and just let the marketplace sort out the risk.  This solution would save taxpayers a lot of money, but it would create problems, especially for lower income groups.  Moreover, it would be very unpopular with those whose insurance is presently being subsidized.  Any way you look at it, there are tough choices to make.  The key point, however, is that these are the real decisions that need to be made if we want to reduce the cost of hurricanes.

            This problem isn't limited to construction of properties in storm-surge-prone areas.  The case of Houston and Hurricane Harvey is instructive here.  The impact of Hurricane Harvey on Houston was not related to storm surge.  Even though Houston is a good distance from the Gulf of Mexico, it still has numerous areas that are prone to flooding.  Yet there's been lots of development in those areas thanks to government-backed flood insurance. 

            The other thing about Houston is that flooding is a recurring problem.  I personally experienced a 25-inch rainstorm in Houston one day in the summer of 1976.  It had absolutely nothing to do with a hurricane.  The flooding was horrendous.  There have been numerous other floods since.  The problem is exacerbated by poor soil, excess construction, and inadequate zoning – all problems which are understood, but for which not enough has been done. 

            Hurricane or no hurricane, these are costly and deadly problems that need to be prevented.  My point is that one can superficially cite global warming and climate change as the cause, but by doing so one obscures the real problem: building in flood plains and inadequate building codes.

            So while it's important to deal with global warming and climate change, let's not let that be an excuse.  When it comes to problems like hurricanes, let's focus attention on solving the real problems.   

            Please share your thoughts, whether you agree or disagree.  Thanks for reading.   


The long-held assumption is that life first appeared on Earth. But what if life actually predates Earth, and life forms were somehow transported here after Earth's formation?

            Many people readily accept the idea that Darwin's theory of evolution by natural selection applies on a micro scale, meaning at the level of bacteria and viruses, and maybe even to some extent at the level of species.   While they accept these ideas, they reject the idea that Darwin can explain the evolution of life from its most basic forms up to humanity, meaning that while microevolution is real, macroevolution is not.  The argument is that supporters of macroevolution have stretched the available data and have "overplayed their hand".

            One of the key arguments that skeptics of macroevolution have used is that there simply wasn't enough time to explain the appearance of organisms as complex as bacteria and viruses.  The argument hinges on the evidence that the Earth is about 4.5 billion years old, making the amount of time between the formation of Earth and the emergence of bacteria and viruses too short.  Implicit in this is the idea that life must have developed "from scratch" here on Earth.  But what if that's a bad assumption?

            Two scientists who have called this assumption into question are Alexei Sharov, a staff scientist at the National Institute of Aging, and Richard Gordon, a Theoretical Biologist at the Gulf Specimen Marine Laboratory in Florida.  Sharov and Gordon use a novel way to estimate when life first appeared.  As a proxy for the complexity of life, they consider the number of base pairs in an organism.  More complex organisms have more base pairs than less complex organisms.  They observe that the number of base pairs in organisms has increased at an exponential rate over time, much like Moore's Law. 

           In 1965, Gordon Moore looked at the number of transistors on a computer chip and noted that it was doubling every 18 to 24 months.  "Curve fitting" just four data points (1962 – 1965), he projected that this exponential growth, now referred to as Moore's Law, would continue into the future.  In his original paper the projection only went 10 years into the future – to 1975.  That ten-year projection has taken on a life of its own, and for the past 50 years it has proven accurate.  Sharov and Gordon use this as a model and suggest a "reverse Moore's Law".  For example, if one looked at the number of transistors on a chip at various points from 1995 to 2015, one could "reverse project" that there were only a few transistors on a chip back in the late 1950's – 1959 being the starting point for Moore's original curve-fitting graph – and that "reverse projection" would be quite accurate.   

            Sharov and Gordon apply this line of thinking and do a similar "reverse projection" for genetic complexity (see the chart above).  They look at the times that various organisms (e.g., prokaryotes, eukaryotes, worms, fish and mammals) emerged, and plot those dates against the genetic complexity of each type of organism.  (Eukaryotes and prokaryotes are both organisms with cell membranes, but eukaryotes also have a nucleus.)  Their "reverse projection" suggests that a "genomic complexity of zero, meaning just one base pair of nucleotides", would have occurred approximately 9.75 +/- 2.5 billion years ago.  That's well after the Big Bang (approximately 13.8 billion years ago) but also well before the formation of Earth (about 4.5 billion years ago).  Even at the lower bound, Sharov and Gordon say that life emerged 7.25 billion years ago, still well before our Earth formed.
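To make the "reverse projection" idea concrete, here is a minimal sketch in Python. The organism dates and complexity values below are rough illustrative assumptions loosely inspired by Sharov and Gordon's chart, not their published data; fitting a straight line to log10(genome complexity) versus time and extrapolating back to a complexity of one base pair (log10 = 0) lands in the same general range they report.

```python
import math

# Illustrative data points: (organism, billions of years ago it emerged,
# approximate functional genome size in base pairs). These specific numbers
# are assumptions for illustration only, not Sharov and Gordon's values.
organisms = [
    ("prokaryotes", 3.5, 5e5),
    ("eukaryotes",  2.0, 3e6),
    ("worms",       0.9, 1e7),
    ("fish",        0.5, 7e7),
    ("mammals",     0.1, 2e9),
]

# Least-squares fit of log10(complexity) against time (negative = in the past).
xs = [-bya for _, bya, _ in organisms]
ys = [math.log10(bp) for _, _, bp in organisms]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
intercept = ybar - slope * xbar

# Extrapolate back to log10(complexity) = 0, i.e. a single base pair.
origin_bya = intercept / slope  # billions of years ago
print(f"Extrapolated origin of life: about {origin_bya:.1f} billion years ago")
```

The exact answer depends entirely on the data points one feeds in, which is one reason Sharov and Gordon attach an error bar of +/- 2.5 billion years to their estimate.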

            Thus, Sharov and Gordon's proposal could address the objection that many have raised about the appearance of life on Earth.  One might argue that life could not realistically have arisen within 500 million years of the formation of Earth, but 5 billion years is more than realistic.

For Sharov and Gordon's theory to be realistic, two key questions need to be answered.  First, could life have begun from only one nucleotide base pair?  Second, if life began before the formation of Earth, how did early life forms survive travel through interstellar space and arrive intact on Earth?

            With respect to the first question, Sharov and Gordon present a theory based upon what they call coenzyme-like molecules (CLMs).  Their model is hypothetical, but it is certainly not out of the question.    The core idea is that CLMs could be a realistic precursor to the nucleotides A, C, G, and T that underlie genetics.  Sharov and Gordon hypothesize that CLMs existed in hydrocarbon microspheres, which could have created a realistic environment for nucleotides to emerge.

            Assuming the original nucleotides emerged somewhere in the universe about 9.75 billion years ago, plus or minus a couple of billion years, how did those nucleotides traverse interstellar space?  If that question cannot be adequately answered, then regardless of whether the original nucleotides emerged at the time hypothesized by Sharov and Gordon, the idea of life emerging elsewhere in the universe and being transported to Earth is effectively moot.  Sharov and Gordon cite the research of L.H. Lambert and others, who extracted the bacterium Staphylococcus succinus from Dominican amber; its spores had been dormant for 25 to 35 million years.   They also cite research by Richard Gordon and R.B. Hoover that "remnants of planets from exploded supernovae can carry billions of bacterial spores and maybe even active chemosynthetic bacteria deep beneath the surface."  In other words, bacterial spores could have been buried in interstellar material, lain dormant for possibly millions of years, then been revived on another world.  It sounds somewhat far-fetched, but it is not necessarily unrealistic.

            If Sharov and Gordon are right, then the idea that genetic complexity follows a Moore's Law type of curve isn't far-fetched at all.  Moreover, it could overcome the perceived problem that bacteria and viruses could not have formed on Earth because of the short time period from the formation of the Earth until their appearance.

            What, then, of the idea that life emerged 9.75 billion years ago, about 5 billion years before Earth formed?  The reason this isn't necessarily crazy is that the universe appears to contain as many as 10,000,000,000,000,000,000,000 (that's 10 to the 22nd power) stars like our own.  While only a very small fraction of those stars are likely to have had planets with conditions that could have supported the emergence of life, the sheer number of possible candidates makes this a very realistic scenario.  Assume, for a moment, that there was only a one in a trillion chance that any particular star could have a planet capable of supporting life of some sort.  Even then, there would still be approximately 10,000,000,000 (ten billion) such stars.  If it were a one in a quadrillion chance, then approximately ten million stars would have planets orbiting them capable of supporting life.
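This "sheer numbers" argument is simple arithmetic, easy to check directly. The one-in-a-trillion and one-in-a-quadrillion odds are purely illustrative assumptions, as is the 10-to-the-22nd star count:

```python
# Rough estimate of Sun-like stars in the observable universe (an assumption).
stars = 10 ** 22

# Candidate life-supporting stars under two illustrative odds.
one_in_a_trillion = stars // 10 ** 12     # 10^22 / 10^12 = 10^10, ten billion
one_in_a_quadrillion = stars // 10 ** 15  # 10^22 / 10^15 = 10^7, ten million

print(f"{one_in_a_trillion:,} candidate stars at one-in-a-trillion odds")
print(f"{one_in_a_quadrillion:,} candidate stars at one-in-a-quadrillion odds")
```

Even under the stingier odds, millions of candidate stars remain, which is the whole force of the argument.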

The Big Bang occurred about 13.8 billion years ago.  Assuming Sharov and Gordon are correct, then the first life forms appeared about 4 billion years after the Big Bang.  Four billion years should have been adequate time for life forms to have emerged.

            Assuming this is the case, were the life forms that were transferred to Earth advanced and intelligent?  The idea that Earth was seeded by intelligent life (sometimes known as "directed panspermia") is fairly well known.  Sharov and Gordon reject the idea that the Earth was seeded by intelligent life.  This is because they believe it would have taken at least 10 billion years for intelligent life to have formed.  Assuming the Big Bang really did occur 13.8 billion years ago, then even if life formed within a billion years of the Big Bang, at the time of Earth's formation (4.5 billion years ago) life could only have been about 8 billion years old.  Sharov and Gordon contend that it would have taken at least 9 or 10 billion years for intelligent life to form (refer back to the chart above), thus it would have been impossible for the Earth to have been seeded by intelligent life.

            Non-religious people should have absolutely no problem with Sharov and Gordon's theory, but can the same be said for Christians?  I really don't think it should create problems for most Christians.    

            Young earth creationists (YEC) will definitely have a problem with the theory, but anyone who is a YEC would have problems with any theory suggesting that the Earth, much less the universe, is much older than about 6,000 years.  Young earth creationists believe in a literal interpretation of the book of Genesis.  On the other hand, old earth creationists and evolutionary creationists (the latter being, like me, those who believe that God created the world using Darwin's evolution by natural selection) should have no problem with the theory. 

            The Bible says that God created all life, but it doesn't say where or when it happened.  The assumption has always been that life was created on Earth, but the Bible doesn't specifically say that.  For most of history, most everyone assumed that life was created on Earth, but no one was aware of the sheer size and age of the universe, and no one was aware of the genetic curve calculated by Sharov and Gordon, suggesting that life began about 9.75 billion years ago.

            At this point, Sharov and Gordon's analysis doesn't prove or disprove anything, but I believe it is useful because it helps reduce constraints on our thinking about how life emerged.  For the longest time, we've constrained ourselves to the assumption that life had to have begun on Earth, not somewhere else.  The available data have not always fit this model well.  Eliminating the constraint creates the possibility of other alternatives.  At the same time, it also doesn't provide any more evidence that life spontaneously emerged, the claim of many atheists and non-theists.

            If anything, the argument made by Sharov and Gordon should be encouraging for Christians who believe that Darwin's theory of evolution by natural selection is correct.  This is because it provides a way to overcome the objection that life could not have emerged on Earth according to Darwin's theory because of the relatively short time between the formation of the Earth and the emergence of life. 



Carl Treleaven is an entrepreneur, author, strong supporter of various non-profits, and committed Christian. He is CEO of Westlake Ventures, Inc., a company with diversified investments in printing and software.


© 2016 - 2018 Unexpected Perspective - All Rights Reserved.