Welcome to WordPress.com. This is your first post. Edit or delete it and start blogging!
Essentially there were four entities: brokerages which sold financial products (mostly to the public), investment banks specializing in proprietary trading of various sorts, investment banks specializing in underwriting and mergers and acquisitions, and commercial banks. Commercial banks offered checking accounts (at cost), savings accounts (on which the interest offered was strictly limited by Regulation Q), and commercial loans of various sorts. Commercial banks (except for the five large "Money Center" banks) were small, and mostly limited to operations within a single state. Commercial banks engaged in a generally low-risk, modestly profitable business in which they obtained funds relatively cheaply (through checking account deposits and Reg Q-limited savings deposits), made some money playing the "float" (of several days' duration), and made some more money from the interest charged on the commercial loans and mortgage products which they offered. Risk was low, risk aversion was extreme, and compensation and profits were moderate. Across the street in the investment banking community risk was high, appetite for risk was large (with numerous mechanisms of a highly technical, quantitative nature in place to help manage risk), and compensation and profits (in most years) were large. If an investment bank made a mistake and lost money, it took its lumps, ate the loss, and moved on.
All this started to change as the economic and technological background began to evolve. Two events stand out in my mind: NOW accounts, and cash machines. NOW accounts (offered primarily by brokerage firms) represented a direct threat to the income streams of commercial banks because they could offer returns on deposited funds above those offered by the Reg Q-limited commercial banks. In general there was pressure to "speed up" the flow of money throughout the economy. Interest offered on deposited funds had to be increased, and the "float" periods were narrowed; cash machines made deposited funds more available to depositors as well.
Commercial banks responded by asking the government to eliminate Reg Q and allow them to enter into "brokerage" businesses. Meanwhile, across the street, the investment banks were inventing new kinds of financial products to meet the needs of an ever more complex and international financial marketplace. One of these new products, mortgage-backed securities, helped to lower the interest rate charged on housing loans, and as a side effect did some social good in eliminating such pernicious practices as "redlining". Over time, however, they also in effect removed a traditional income stream from commercial banks; commercial banks became the originators and "servicers" of the loan (for which they received a fee), but in general they no longer received the interest paid as the loan was repaid. Somewhere along the way restrictions on interstate banking were reduced, and then effectively eliminated, and large transnational entities began to emerge.
This set the stage for the final result: large multinational banks engaged in both traditional commercial banking as well as the riskier brokerage and investment banking businesses. With the recent meltdown in the housing marketplace, it has become apparent that some of these institutions are literally "too big to fail" because they are intertwined in every facet of the financial landscape across essentially the whole world — such that a failure could possibly lead to a "domino effect": a widespread financial disaster with dire consequences not only for our nation but many others as well. Many of these institutions have evolved corporate cultures with little understanding of the risks associated with the high profitability of some of their products — leading to an overreliance on those products in contributing to the bottom line.
I think, therefore, that a return to some form of "Glass-Steagall" is going to be necessary. Certain activities are vital to the day in day out functioning of our economy, have risk profiles that are relatively low and manageable, and need to be supported as a matter of public policy to foster financial and political stability. On the other hand the riskier, high profit transactions are also vital in the new highly fluid, technologically advanced, "global" economy. We need to foster both kinds of activities and protect them as well as we are able. Separating high risk from low risk activities seems to me to be a good idea. In addition I think we need a mechanism somewhat akin to the FDIC which would include a fund to bail out institutions engaged in high-risk activities — especially institutions that are regarded as "too big to fail". Institutions in the "too big to fail" category should be charged an ongoing fee to ensure that this fund is adequately provisioned. In addition should an institution be a member of the "too big to fail" club they would be subject to additional "emergency" levies should the money available in the "too big to fail" fund be inadequate to cover a particular failure.
Our efforts here need to be "ex ante" rather than "ex post" — we need to get away from this notion of "punishment" for bad behavior and rather recognize that these activities are necessary, yet nevertheless require some exposure to risk, and that there will be from time to time failures that will need to be accommodated. Further, we need to recognize that the analysis and management of risk, and the willingness to undertake risky transactions, are highly complex and technical activities that require higher than normal compensation. The notion that certain "bonuses" are out of line needs to be set against the highly skilled and risky nature of the transactions these people engage in.
Obama’s proposal appears to me in general to be headed in the right direction; at least Paul Volcker understands the environment and the relevant history.
As some of you already know, about a month ago I bought myself a Kindle DX. I have subsequently stopped reading printed material (except perhaps for rereading some of the 800 printed mystery and sci-fi titles in my personal library). I really like the ability to purchase a title in the Kindle store and begin to read it in a matter of minutes — no more trips to the bookstore. (I will have much more to say about my Kindle experiences in another post.)
I will confine myself here to titles that I have read since I acquired the Kindle. I started with a science-fiction story: the last volume of a tetralogy, the first three of which I had read in printed form. The transition from the printed-word to the Kindle-word was relatively seamless. After reading that last book: "Strength and Honor", I went on to read "Stormbreaker" the first book in the Alex Rider children’s series.
I often find juveniles to be entertaining — I, like many others, really enjoyed the Harry Potter books, and there are several other juvenile series that I have enjoyed. From this I moved on to "Point Blank", the second in the Alex Rider series. Both were enjoyable enough (and certainly the pacing is quite breathless), but the characterizations and their ongoing development were fairly simple — so I probably will not continue with the series.
I should point out that when I read for pleasure, the stories and the plot are less important to me than the characterizations of the principals, and their evolution throughout the life of the series.
Next up was a Heinlein juvenile that I had read before: "Space Cadet", which was enjoyable, and I will probably read other Heinlein titles available in the Kindle store (since Heinlein is long dead, these are quite reasonably priced). I have never found a Heinlein story not to be enjoyable (although I prefer the earlier titles to the later).
I then turned to more current fare: "Pursuit of Honor" by Vince Flynn, a series I have been reading for about a year or so. This author is much praised by Glenn Beck (I won’t, however, hold that against Mr. Flynn). I then bought a book that Amazon was giving away for free, and it was okay — but you get what you pay for.
When browsing the Kindle store from the Kindle device itself, it is quite easy to click on the "buy" button, and, if you're not careful (especially when the Kindle store is responding a bit slowly), to hit the "confirm purchase" button as well. This happened to me, and I ended up as the proud owner of "Vintage Cheever", a collection of writings by John Cheever. I was initially quite annoyed because this was not a genre (I thought) of fiction for which I had any fondness. After a bit I realized that I had been confusing John Cheever with John Updike (Cheever has been long dead; Updike died only recently). Anyhow I decided to take a look at what I had so inadvertently acquired: mostly a collection of short stories and novel fragments — apparently Cheever was noted more for his short stories than anything else. Scanning down the table of contents my eye snagged on one title: "The Swimmer". Not too long ago I had been sort of watching a movie of the same name starring Burt Lancaster about a man at a party in the suburbs of New York City who decides to swim home, swimming pool by swimming pool — a distance of about 7 miles. I was (as is my wont) switching back and forth from this to another movie to avoid commercial interruptions, and so I came to the end of "The Swimmer" a little bit confused as to what had happened. So I figured ah hah! Here's a chance to read the story behind the movie and learn what was behind it all — after all, the book is always longer and more detailed than the movie. Boy was I wrong. "The Swimmer" is a short story of perhaps 10 pages in length, very spare, and to my mind a metaphor of a man passing through his life from vigorous youth to somewhat decrepit old age; whereas the movie comes across as a story of a deluded man who thinks he is a well-to-do suburban bon vivant, and who is ultimately forced to face the reality that he is a somewhat deranged bankrupt.
The movie, which it turns out was made by the producers of "David and Lisa", has stuck with me, and, while a small thing, I would nevertheless not hesitate to recommend it should it again be shown on cable.
Back to browsing the Kindle store, where I made a marvelous discovery: many complete works out of copyright and in the public domain were available for a dollar or two. I was thus able to acquire 33 Wodehouse novels, all of the Jeeves stories, and "Piccadilly Jim" for less than five dollars. Wodehouse never ceases to provide chuckles and much merriment. If one were ever to create a top 10 list of the funniest novels in the English language, number one on the list would probably be "Three Men in a Boat" by Jerome K. Jerome (written in the latter part of the 19th century). Evelyn Waugh would perhaps scrape in at position 10 with "Decline and Fall" — the remaining eight positions on the list would all be filled by P.G. Wodehouse.
Lastly, we come to the original Tom Swift series (which I have mentioned in a previous post), the whole of which can be bought in the Kindle store for the ridiculous sum of one dollar. I have just finished rereading "Tom Swift and His Motorcycle", which is interesting less for the story, more for the background it paints of what everyday life was like circa 1910. The state of technology, and its prevalence within the society depicted, is interesting — I will probably write a post about technology as a backdrop to life in the 20th century at some later time. One minor warning: much of the dialogue and attitudes expressed in this first Tom Swift outing are far from what today would be regarded as "politically correct". One other note: the action takes place in and around the fictional town of "Shopton" in upstate New York. Shopton is on the shores of "Lake Carlopa", believed by many to actually be Lake George. I, with some of my brothers and sisters, grew up for a time in Ticonderoga, NY — which is also on the shores of Lake George.
My first recollection of reading anything comes from the second grade, where I remember the teacher having us read passages from whatever Dick and Jane reader was part of the educational orthodoxy of upstate New York in 1952. I remember these readers as being pretty lame, and none of it was of any difficulty for me, though it was not so easy for many of the other members of the class. I remember being so bored by it all that I would feign difficulty with passages just to make the session more lively, and get some attention from the teacher.
At about the same time I was also reading comic books of a particularly grisly sort — the kind with dripping zombies climbing out of the mud of their graves, or skeletons walking about to terrify all and sundry. I remember this because my mother attempted to dissuade me from buying these, as they would make me afraid, and "I'd be sorry". She was of course right, and after a few nights of disquieting nightmares, I stuck to more mundane matter — mostly Scrooge McDuck.
About a year later I was spending a Saturday afternoon with my grandmother Boyhan, and she, finding me too much underfoot, said, "Why don't you read this — your father read it when he was a little boy." She then handed me a copy of "Tom Swift and His Motorcycle", which was written in 1910 (my father was born in 1922). Anyhow, this was the first real bound book that I ever read from cover to cover. From time to time thereafter I would read other titles from the original Tom Swift series, but I found them horribly dated (a motorcycle that can go "a mile a minute!" seemed quite tame even by the standards of 1952). By 1953 I was spending all of my weekly allowance (a princely sum of one dollar) on titles from the Tom Swift Junior series. I read these through about the 15th volume, by which time I had moved on to meatier stuff.
One afternoon I was at my grandmother Thomas’s house with my mother with nothing to do, and I asked her for a reading suggestion (she was an avid reader of mysteries and particularly liked Perry Mason and The Saint). She suggested I read "Fer de Lance" by Rex Stout — "I really like all the arguments between Nero Wolfe and Inspector Cramer, as well as all the beer he drinks". Thus began my lifelong love affair with one of my two favorite authors (the other being PG Wodehouse).
Most of my reading for pleasure since then has been of the mystery or science fiction variety (with, more lately, a soupçon of thrillers added to the mix).
In order to avoid this unhappy circumstance I have decided to do a series of "level setting" posts on the variety of topics upon which I wish to discourse. These posts will be placed under the category of "level-setting". Henceforth, whenever you see this category, you will know that this is just me attempting to provide some background information about a subject that I intend to post on at some future date.
Hopefully, this will enable me to write shorter, pithier posts. I will attempt to provide a link to whatever level-setting posts seem appropriate before I get too far into the res.
One of the things I’ve hoped to do with this blog is share with you all the little useful Web 2.0 applications, browser buttons, and what have you — that have changed, and are in the process of changing the ways that I (and I suspect many of you) improve our day to day lives with their assistance.
I have a long list of these, but the first that I would like to share with you is an application called "readability". I came across this in a column written by David Pogue of the New York Times (link is here:). This little application (it's a button really) strips out all of the advertisements and non-germane clutter from web articles and reformats them into a pleasing reading layout — with your choice (made at installation time) of target device, font style and size, and surrounding white space.
To install "readability" in Firefox, just drag the white button from the readability homepage to your bookmarks toolbar. In Internet Explorer it is a little more complicated: on the readability homepage, right-click the white button and select "Add to favorites"; then go to your favorites, find the readability favorite just added, right-click it, and select "Add to favorites bar". Thereafter, any time you click the "readability" button on the favorites bar, the readability action will be performed on the currently displayed webpage. Needless to say, in both browsers you will need to have the bookmarks (or favorites) toolbar enabled to get maximum convenience from this feature.
The reformatted page is very clean and readable with buttons in the upper left-hand corner allowing you to: reload the page in its original format, print the page in its reformatted layout, or e-mail the reformatted page to the favored correspondent of your choice.
I have mentioned in these postings and elsewhere that I dislike reading from screens, and that I tend to print out most of what I eventually read. This has presented a problem in the past because most modern web pages are not in and of themselves printable (if you try to print them you get some pretty gosh-awful output). Some web pages have buttons to reformat them in a printable form — increasingly, however, webpage authors are not providing these buttons (perhaps for copyright reasons?). I have tended to get around this by manually selecting the parts of the page I wish to read, and then printing these selected parts (not always successfully), or copying the selections to OneNote where (with some light manual editing) a pleasing printable page can be created. Needless to say this can be a labor-intensive process. Lately, with my acquisition of a Kindle DX (more on this in a coming post), the ability to format webpages for delivery to the Kindle and eventual reading has made this capability even more critical.
Consequently, "readability" comes along at a very opportune time (one of the device targets it can format for is the Kindle).
In any event with or without my idiosyncratic reading necessities, the ability to get rid of all of the advertisements and other non-germane clutter should be a big plus for anyone wishing to read a Web-hosted article.
For those of you wishing to bypass the New York Times article and go straight to the readability page, the link is here.
After using this for a while I want to point out that there are some pages that "readability" just cannot process correctly — this is because the add-on misidentifies which parts of the page are important. The output is still nice and clean; it just doesn't contain the information of interest. Still, for most pages this is a great add-on.
The following editorial in the Wall Street Journal: Climategate: Science is Dying prompted this comment/response to them from me:
The sheer arrogant hubris of it all!!
Problems with the data:
In the late 60s and early 70s I worked alongside some scientists developing early climate models. The problem back then (as it is today) is the lack of good finely detailed temperature and wind data from all the points (and altitudes) over the whole surface of the earth. We only have any data at all going back maybe 150 years (and as we have recently learned even there the raw data has been irretrievably lost). We only have "good" data perhaps going back 80 or 90 years; and even that data is fairly sparse on the ground — located mostly at major concentrations of human habitation. It is only in the last 30 years or so with the rise of satellite technology that we have reasonably good data for points over the whole of the earth. Even then the graininess of the data is still unacceptably large.
Problems with the input parameters:
The Earth is an extremely complex system — no model can hope to encompass every possible variation and situation. Consequently it is quite normal in these models to adopt various simplifying assumptions that in some way encapsulate our best understanding of the underlying processes and science. Stable or "good" (useful?) models do not vary much in their predicted outcomes when small changes in these input parameters are made. Models with large, wide variations in outcomes upon small changes to the input parameters are deemed to be "chaotic". Climate models are mostly of the latter "chaotic" form. Need a "hockey stick"? I can give you a parameter that will return you that output. Want a 1° average rise in temperature over the next century — there's a set of parameters for that too. Or suppose (mirabile dictu) you actually desire a temperature decline — why there's a set of parameters for that as well!
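This sensitivity is easy to demonstrate with a toy model. The sketch below (plain Python of my own devising, not anything from an actual climate code) iterates the classic logistic map, a one-line "model" that is chaotic for some parameter values and stable for others. In the chaotic regime, a one-in-a-million nudge to the starting condition produces a wildly different trajectory; in the stable regime, the same nudge simply dies out.

```python
def trajectory(r, x0, steps=60):
    """Iterate the logistic map x -> r*x*(1-x) and return the whole path."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

def max_divergence(r, x0, nudge=1e-6):
    """Largest gap between a trajectory and a minutely perturbed copy."""
    base = trajectory(r, x0)
    perturbed = trajectory(r, x0 + nudge)
    return max(abs(a - b) for a, b in zip(base, perturbed))

# r = 3.9 is a chaotic parameter choice: the tiny nudge blows up.
chaotic_gap = max_divergence(3.9, 0.5)

# r = 2.5 is a stable choice: the map settles to a fixed point
# and the nudge is forgotten entirely.
stable_gap = max_divergence(2.5, 0.5)

print(f"chaotic parameter: trajectories diverge by up to {chaotic_gap:.3f}")
print(f"stable parameter:  trajectories diverge by at most {stable_gap:.2e}")
```

The same inputs, the same equation, and only the parameter differs — yet one setting makes the model's output essentially unpredictable while the other makes it robust. That, in miniature, is the distinction drawn above.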
Problems with the timescales:
The Earth as a scientific system operates at energy levels and timescales far beyond those of human lifetimes. We are as mayflies in the geologic scheme of mother Earth. Planetary studies of our sisters Venus and Mars have shown us just how unique the Earth's biosphere is. Venus could have been like the Earth, but isn't. Climate scientists frequently like to point out that Venus is in our future if we don't clean up our carbon-emitting act. The fact is that in many ways the biosphere of the Earth has been remarkably stable for a very long period of time (easily hundreds of millions of years — perhaps even billions). Over this very long time, the Earth has been subjected to a variety of large disturbances and shocks, yet the biosphere is still with us. Clearly there are stabilizers in the Earth system — operating on timescales of thousands and millions of years — that keep things ticking over nicely. The thought that we could in our puny way somehow alter these processes (when as yet we have almost no good scientific understanding of them) is arrant nonsense.
At the end of the day what the scientists are engaged in is a statistical exercise. They take a certain amount of data, and from that attempt to project what that data might look like in the future. This is not science as we were taught it in school (in fairness, given the timescales there is really nothing else that scientists can do); there are no hypotheses followed by experiments under rigidly controlled conditions with a look-see at the end at some results. I have mentioned before the problems with the data that we have. The "good" baseline data is probably only 30 to 40 years long (and even that is pretty sparse on a global scale). This baseline is good enough to do weather prediction, but not nearly long enough to do any kind of longer-term prediction of the sort that climate scientists are using as the basis of all their alarums and warnings. As I stated before: the Earth is an extremely complicated system operating at energies and time scales far beyond those of mere mortal men (sorry, Superman). For us then to turn around and say we can drastically modify how we live, dramatically lower our standards of living, and thereby have some material impact on a climatological outcome is arrogant folly of the first order!
Scientists should stick to science and keep their noses out of public policy (politicians already mess this up quite nicely, thank you very much) — too much of what has already transpired (as others here in the Wall Street Journal have pointed out) has been driven by a money trail whose corrosive effects are even now only dimly perceived.
Much of the discussion here has been in the form of ad hominem attacks against Mr. Henninger, scientists in general, readers of the Wall Street Journal, and certain individual posters to this thread. This is not at all helpful.
I don't know whether the climate scientists are correct or not. I wouldn't even begin to venture a guess as to what mean global temperatures might be 100 years hence. We do, however, seem to be losing the main thrust of the editorial, which is that climate scientists (and their policy allies) are arguing for a "precautionary" action agenda. To me this is a little bit like Chicken Little running off shouting "the sky is falling" as the first nugget of hail bounces off his head.
We need, I think, instead to adopt the principle used by doctors and those engaged in medical research (the Earth is, after all, our very dear, and only, home): that principle is best expressed in the Hippocratic oath by the words "first do no harm".
Climate scientists and many policy wonks are advocating radical restructurings of just about every facet of human existence. As several have stated here: thousands of climate scientists have all come to similar conclusions, and they can't all be wrong. Well, thousands of scientists thought that the Ptolemaic view of the solar system was the correct one and that Copernicus was wrong (and in fact the Ptolemaic view produced better planetary predictions for many years after — until Kepler came along). Not to demean the scientists, but I find arguments based on the "authority" of thousands of scientists not very compelling.
Luckily, most of us live in democracies, and I doubt that the precautionary principle is going to get much real traction. Take the behavior of most of the signatories to the Kyoto protocol and the emissions levels they agreed to achieve: not one of them even came close to hitting their targets; and most European nations didn't even try (from a political standpoint) to put in place anything that would move them towards their climate goals. In actual fact, the United States, which was not a signatory, came closer to hitting its projected targets (even though it missed them by a wide margin) than its European cousins. Politically, Kyoto, Copenhagen, and the endless stream of climate change conferences to come will achieve nothing meaningful (other than increased funding streams for our friendly climate scientists — which is not necessarily bad — new knowledge is never a bad thing; and if the scientists are right, understandings grounded in better science will lead to better policy options should they be needed sometime in the future).
If there is a real problem here, it is one of pollution: the pollution of too many people; but you will find few willing to address "that" 500 pound gorilla sitting at the table. Another point which I find consistently overlooked is the fact that, even if global temperatures were to rise on average, the actual effects on individual areas over the whole of the Earth would vary widely (I have even heard projections from climate scientists saying that if global temperature rises overall, it will nevertheless result in a dramatic cooling in many areas — one notable example of which would be Great Britain).
I am not advocating a do-nothing strategy here (in fact I think many "green" initiatives will have beneficial outcomes — certainly moving away from fossil fuels is something we ought to be doing whether the temperature is rising or not). On the other hand I think given the timescales, the enormous complexities, the "apparent" uncertainties, and mostly because much of this is based on computer simulations, we ought to be a little bit humble, a lot cautious, and as I said above: "first do no harm".
A recent article in the Washington Post (Hydrogen-powered car still seems improbable – washingtonpost.com) prompted me to think some more about energy policy in general, and hydrogen fuel cell technology in particular.
Discussions about technology (especially technology policy) are hampered by two different and conflicting (but both very necessary) viewpoints. I speak of the architect on the one hand, and the engineer on the other. The referenced article seems to come at the issue of hydrogen powered vehicles from a mostly engineering perspective. I on the other hand am more of an architect in my outlook (I have never met an inconvenient fact that I could not gleefully ignore in the pursuit of an elegant design ). Engineers from my point of view (not surprisingly) are often too pessimistic, while architects are too optimistic (well at least I’m happy when my project crashes around my ears).
My specific response to the article (which is posted — along with others — at http://www.washingtonpost.com/wp-dyn/content/article/2009/11/16/AR2009111602668_Comments.html) is below:
This good article asks the two most important questions:
1. Where does the hydrogen come from? What does it cost to produce? What are the direct costs, and the indirect costs (pollution, greenhouse gas emissions, safety, etc.)?
2. After we’ve produced the hydrogen how do we deliver it to where it needs to go (the supply chain question)?
To these questions I would add two subsidiary ones:
3. Where should the hydrogen producing facilities be located?
4. What is the best place in an energy infrastructure to utilize fuel cells?
Living in Florida as I do, and being without electric power for 2 to 3 weeks after every hurricane (and also being a child of the 60s ) I have become quite enamored of decentralized infrastructure strategies.
It doesn’t necessarily follow that a hydrogen economy means that cars are powered by hydrogen. It is quite possible (and perhaps even desirable) to locate fuel cells on a concrete pad at the back of every house. This would provide all of the electrical needs for the household and could also be used for overnight charging of an all electric vehicle. Hydrogen could be delivered to the house in much the same way as propane is delivered to many houses in the Northeast for seasonal heating.
Hydrogen could be produced centrally (as it appears the author of the article assumes), but it could also be produced locally — perhaps every existing gas station could be converted to a hydrogen producing facility using electrolysis or some other method.
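To give a rough sense of what local production via electrolysis implies, here is a back-of-envelope sketch of the round-trip efficiency of such a loop — grid electricity turned into hydrogen, then hydrogen turned back into electricity in a fuel cell on the pad behind the house. All of the constants below are my own assumed ballpark figures (rough figures commonly quoted for electrolyzers and fuel cells), not measurements from any particular product:

```python
# Rough round-trip efficiency of a local "electricity -> hydrogen -> electricity" loop.
# Every constant here is an assumed ballpark figure, not measured data.

ELECTROLYZER_KWH_PER_KG = 55.0   # electricity to electrolyze 1 kg of H2 (assumed)
H2_LHV_KWH_PER_KG = 33.3         # lower heating value of hydrogen, ~33.3 kWh/kg
FUEL_CELL_EFFICIENCY = 0.55      # electrical efficiency of the home fuel cell (assumed)

# Electricity recovered from each kilogram of hydrogen fed to the fuel cell
kwh_back_per_kg = H2_LHV_KWH_PER_KG * FUEL_CELL_EFFICIENCY

# Fraction of the original grid electricity that makes it back out
round_trip = kwh_back_per_kg / ELECTROLYZER_KWH_PER_KG

print(f"Electricity recovered per kg of H2: {kwh_back_per_kg:.1f} kWh")
print(f"Round-trip efficiency: {round_trip:.0%}")
```

Under these assumptions only about a third of the original electricity comes back out of the fuel cell — which is exactly why the questions of where to produce the hydrogen, and where in the infrastructure to site the fuel cells, matter so much.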
I would like to make two other observations: there is an electric car start-up (I believe it is out in California) whose concept is that rather than charging a vehicle overnight at home (or at a charging station), one instead drives up to a battery exchange station, swaps the old depleted battery packs for fully charged ones, pays for the exchange, and is off again in a time frame about the same as it would take to buy a tank of gas today.
I would also point interested readers at a 2002 Scientific American article about the General Motors AUTOnomy concept car, which highlights how changes in the drive train (to, say, a fuel cell) can radically change how one might design the final automobile: Designing Autonomy.
I guess my point is that if we are to consider a hydrogen infrastructure we must do this in conjunction with a rethink of how we generate and deliver energy more generally within our economy.
In conclusion, I don't disagree that a "hydrogen" economy is probably a ways off, but on the other hand, in light of trillion-dollar bailouts to all and sundry, I don't find that a couple of hundred million in research funds for alternative energy technologies is all that out of line. At this point we ought to be investing in as many different energy producing technologies as we can (many of these would only require a few million dollars per year).
It is useful at smaller sizes to think of a fuel cell as a much better battery (fuel cells have been used as laptop battery packs in addition to powering cars), but fuel cells can be scaled up to kilowatt and megawatt ranges. I was reminded of this while watching yesterday’s shuttle launch when a few minutes before takeoff the flight controller asked the crew to switch all shuttle power to the on-board fuel cells. Clearly the fuel cell has been around for many years now, and has proved reliable in a variety of applications. You can buy a fuel cell for your house instead of a generator, and these have been installed in a number of locations around the country (apparently a company in California will even sell you one that runs on sewer or landfill gas i.e. methane).
Fuel cells are not a panacea; there are problems, as mentioned in the article, but I believe they are solvable with some modest research.
I recently had the opportunity to read the “highlights” of the Department of Energy’s fiscal 2010 budget (it is 102 pages long, and that’s only the executive summary…sigh). The document is a useful starting point, but it bears almost no relation to what we have spent, or will spend, on energy-related research. Congress completely redid the budget to reflect its own priorities (and to insert a lot of juicy earmarks). There were also significant supplementals in 2009 and 2010 coming from stimulus funding. It is consequently virtually impossible to figure out what we are really spending on research. A further complication is that many similar-sounding items are carried in widely different areas of the budget. Some things that are actually energy research are carried in the weapons-development part of the budget (presumably because military-related items are scrutinized differently). Other energy research items are carried directly in the Department of Defense budget.
It does appear that the administration attempted to “refocus” (read: reduce) fuel-cell research, but Congress restored a lot of it.
In the aggregate I would guess we have spent tens of billions of dollars on energy-related research since the end of World War II. Several studies have highlighted, however, that in real terms research expenditures on energy production have declined markedly, in both the public and private sectors. This should not be all that surprising considering how little the fundamentals of energy generation have changed as a result of all that spending. But given that energy underlies virtually every facet of our society and contributes fundamentally to our standard of living, the decline is nevertheless somewhat disturbing.
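The distinction between nominal and real expenditures can be made concrete by deflating nominal dollars with a price index. The spending figures below are entirely hypothetical (and the CPI values only approximate historical CPI-U levels); the point is the mechanics, not the data:

```python
# Nominal vs. real spending: deflate nominal dollars to a common
# base year using a price index. Spending figures are hypothetical;
# CPI values approximate historical CPI-U levels (1982-84 = 100).
nominal_spend = {1980: 6.0, 1995: 4.5, 2009: 5.0}       # $B, hypothetical
cpi_index = {1980: 82.4, 1995: 152.4, 2009: 214.5}      # approximate

BASE_YEAR = 2009
for year in sorted(nominal_spend):
    real = nominal_spend[year] * cpi_index[BASE_YEAR] / cpi_index[year]
    print(f"{year}: ${nominal_spend[year]:.1f}B nominal "
          f"-> ${real:.1f}B in {BASE_YEAR} dollars")
```

With numbers like these, a budget line that looks roughly flat in nominal terms ($6B then, $5B now) has actually fallen by two-thirds in purchasing power, which is the pattern those studies describe.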
Given the current political and economic realities (whether you agree with them or not), several areas are getting attention: carbon sequestration, nuclear energy (mostly fusion), fuel cells, solar, and energy-distribution infrastructure, to name just a few. The DOE budget appears to give some funding to all of these, but a superficial reading can be misleading.
Let’s look at a couple of examples: carbon sequestration and energy infrastructure expenditures.
Carbon sequestration is about capturing greenhouse gas emissions and either burying them somewhere out of sight and out of mind or converting them into something more benign. Much of the focus here is on coal-fired plant emissions and on agricultural waste (what’s left on the fields after farmers have done their harvest). We tend to forget that a considerable share of greenhouse gas emissions comes not from automobiles and coal-fired generating plants (although both contribute significant fractions) but from agricultural waste and livestock gas emissions (including those from humans). The very fact that DOE is providing funding for this kind of sequestration approach (as opposed to only trying to reduce emissions directly) is, I believe, a step forward.
There is also a considerable amount of funding in the budget for what has come to be called “smart-grid” technology. This mostly has to do with modernizing the electricity-distribution grid and making it more resilient in the face of generation outages. I would, however, prefer that we think about energy distribution more generally, to include not just electricity but also fossil-fuel and hydrogen distribution. I would like us to rethink what energy needs to be centrally generated and what can be done locally. It is my belief that an architecture based on highly localized energy generation would be more flexible and resilient than the centralized distribution grids we currently employ.
Given current concerns about global warming, many have suggested that we move nuclear power generation back to the front burner. I was initially of this opinion, but on reflection I don’t think we ought to build any more fission plants. With Yucca Mountain no longer on the horizon, I believe it would be prudent to run the plants we already have until they reach their natural end of life and then decommission them (itself a lengthy and costly procedure: Maine Yankee, for example, took 7 years and $500 million to take down).
On the other hand, we should pursue research on fusion power technologies very aggressively. I don’t want to go into nuclear technologies in any great detail here (I plan a subsequent post on this topic), but if you read the DOE budget you might think that the only approach being pursued is the ITER international consortium. Hidden away in the weapons part of the budget, however, is funding for inertial confinement fusion (ICF), at a level higher than that for ITER. ICF is construed as a weapons-stewardship program; that is, ICF can be used in lieu of setting off nuclear explosions to answer weapons-oriented questions. While ICF can certainly be used to do this, it is in reality (wink wink, nudge nudge) a fusion power-generating technology with (many think) better potential than that offered by ITER. Clearly ITER will not lead to any kind of commercial power-generation capability before the 2050s (according to its own latest projections). Being a bit cynical (and reading some of the ITER and ICF project justifications), it is apparent that these large-scale projects are more about preserving "rice bowls" than about producing practical results.
There are four or five other fusion projects (most very small, both in size and in dollar expenditures) funded by a variety of private and public sources. Perhaps the most visible of these (based on web interest) is the Polywell project funded by the U.S. Navy. This project has been ongoing since the early 90s and most recently had been funded at about $2 million per year. A major milestone and project review was recently passed, and the Navy has increased funding fivefold going forward. I will talk about this and the other “little” nuclear technologies in more detail in the forthcoming post alluded to above. It is these kinds of small-scale efforts that I believe need more robust funding; right now we’ve got all our eggs in two very big, costly baskets. A small-scale fusion success would be radically transformative.
After I posted this, I came across links to two Scientific American articles that informed my thoughts on nuclear waste handling and carbon sequestration. I add them here for anyone who might be interested: