(Confession: I had to be at another meeting last night and missed the seminar. Any reader who wants to fill in the juicy details in the comments would be appreciated.)
The Advertiser does a fair job of reporting on the event itself. What is missing is the background that would let a reader know what grid computing is, why grid computing is important and why it is being discussed in Lafayette just now.
What grid computing is:
Wikipedia offers a technically oriented overview that's pretty complete if you want the whole nuanced story.
But for the rest of us: Grid computing is a way to make massively effective supercomputers out of the leftovers of everyone's desktop computers. We look at our computers and see amazingly effective, plastic machines that can do an astonishing range of things. Network managers and uber tech types look at them and think: WASTE. All that processing power going to waste. Most of the time nobody is using any cycles, and even when they are working, 90% of the cycles are still going to waste. For the tidy-minded this is just silly. For those with real, unfilled compute-intensive needs it is offensive.
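For the technically curious, the core idea can be sketched in a few lines. This is a toy illustration only, not how any real grid system is built (real projects like BOINC are far more elaborate, with scheduling, redundancy, and verification); every name below is invented for the example. A coordinator splits one big job into small work units, and each idle desktop pulls a unit, crunches it with its spare cycles, and sends back the result:

```python
from queue import Queue

def split_job(numbers, chunk_size):
    """Coordinator side: break one big task into bite-sized work units."""
    return [numbers[i:i + chunk_size] for i in range(0, len(numbers), chunk_size)]

def worker(unit):
    """Desktop side: compute a small piece using otherwise-wasted cycles."""
    return sum(n * n for n in unit)  # stand-in for real number crunching

def run_grid(numbers, chunk_size=4):
    """Simulate the grid: queue up units, let 'workers' drain the queue."""
    units = Queue()
    for u in split_job(numbers, chunk_size):
        units.put(u)
    total = 0
    while not units.empty():
        # In a real grid, many machines pull units in parallel over the network;
        # here one loop stands in for the whole fleet of desktops.
        total += worker(units.get())
    return total

print(run_grid(list(range(10))))  # sum of squares of 0..9, i.e. 285
```

The point of the sketch is just the shape of the thing: the job is embarrassingly divisible, no single machine does much work, and the answer assembles itself from thousands of small contributions.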
Why Grid Computing is Important:
Well, waste is bad. (Your grandparents told you so and they were right.) But beyond that, there really are problems that are either too big to run in real time on even the most expensive supercomputer, or big, real computational problems that nobody can afford to purchase the time to solve. Those are the sorts of problems grid computing is most likely to be used to solve.
One of the virtues of grid computing is that the resulting supercomputer is super cheap. Cheap is good. Suddenly the little guy can do things that only those who could afford "big iron" could do before. You've heard of render farms? Expensive. Exclusive. Anyone with cheap access to a muni-sized grid system could compete cheaply with George Lucas. The most powerful supercomputers are built to model the weather--globally. We could apply the same powerful methods to, say, Lafayette Parish and begin to get a handle on truly local weather. (Those summer thunderstorms in Louisiana that drench one field and leave another 300 yards away bone dry? We could understand that.) Trouble is, the finer-grained analysis uses almost as much compute power as the continental-level projections. We trouble ourselves to afford large-scale predictions whose accuracy--say about hurricanes--would have astonished the best meteorologist in our parents' day. But we can't afford to do the same locally. Grid computing could change that. (The list goes on...)
Why is Lafayette talking about this:
The big hangup with large-scale grid computing is bandwidth. There's never enough of it. People don't want to give it up, and private providers, who profit off maximizing the difference between the bandwidth you buy and that which you actually use, don't want to do anything to encourage a new, big drain on their resources that they don't get paid for. All that is why grid computing is rare and why most small models run on public networks like those at universities, which view local network usage as something to maximize. (They've paid for it and want to use it fully--short of the point of congestion, of course.)
When Lafayette gets its big-bandwidth, 100 meg internal fiber-optic system from LUS, most of us will have bandwidth to burn. It won't cost us anything noticeable to share our cycles and bandwidth with others. Maybe those who choose to do so could get a small rebate on their bill and/or cheap or free access to the computational power that the community has provided itself. The 100 megs would make the grid plenty quick enough for distributed computation. Everyone benefits, and a major, cost-saving, unique, and radically disruptive tool is added to the resources of Lafayette residents and businesses.
Uniquely cheap services are the kind of thing businesses travel to a locale to take advantage of--the food and music here would be a nice plus.
Worth thinking about, no?