From: Brendan <mailinglist at endosquid.com>
Date: Thu, 29 Sep 2005 11:16:26 -0400

On Thursday 29 September 2005 11:17 am, Anthony Gabrielson wrote:
> On Thu, 29 Sep 2005, Ben Jackson wrote:
> For $100 you can have the laptop, and then with the pdf version of a
> text book for $25ish and always have the most up-to-date version - that's
> great. Seems like a good way to go to me. Do students these days really
> write past the fifth grade? In a day when I have read that many
> universities consider undergrad libraries obsolete, this seems a logical
> progression.

Yes, logical if you want the dumbest kids in the world. Why are American kids getting dumber? Instead of pushing technology into every crevice, we need some research that conclusively says where it's needed and where an actual book is better. Mostly, this is going to be used for IM to other kids. Of course, some of them are going to learn more with a laptop, and I guess we'll just depend on those kids making it to adulthood to run the country.

The book monopoly/lobby/scam-artists are never, ever going to allow a $25, self-updating PDF file. Forget that dream when they're making $100-150 per book, per student, per class. No way are they going to ditch *that* revenue stream.

Remember, all the great achievements in nearly every subject were made with actual books teaching actual students. Kepler didn't have the newest Toshiba. Galileo didn't IM his buddies to tell them about the bowling ball experiment, and Copernicus didn't leave a .doc attachment saying he wanted his works published posthumously. For that matter, they didn't have textbook cartels back then, either.

I don't believe that there's something special per se about pulpware (or ragware, for connoisseurs) vs. bits. Both have their advantages and disadvantages in context. It's a lot easier to read a book on the toilet, but it's a lot easier to search on a computer. For archival purposes there are good and bad points both ways. In a general sense, I don't believe that early use of technology is either good or bad per se -- again, there are advantages and disadvantages, and it's what you make of it.

Photography has long been taught by requiring students to use completely manual cameras such as the Pentax K1000 (which has a very simple meter built in; I'm not even sure it has aperture priority, or whether you have to transfer the shutter speed by hand). As an amateur photographer who usually gets very good results, I agree that it's very important to understand lighting, exposure, and composition, or you'll just take snapshots no matter what equipment you buy. I've always used autofocus/autoexposure cameras myself, but I also taught myself about exposure and got very nice results.

However, when I first got a digital camera about two years ago, I found that the quality of my shooting went up rapidly, as much because of the technology (both its advantages and its disadvantages) as anything else. Digital photography offers instant feedback; you see your results right away and never have to wonder later what exposure you actually used on a particular shot. It's also much more demanding of proper exposure than print film, and I think it's actually more demanding than slide film -- overexpose even slightly and the highlights completely blow out. Furthermore, digital photography is much cheaper per frame (factor in consumables such as disk space and flash batteries, and depreciation such as shutter lifetime, and it's maybe 5 cents/shot, about 1/10 that of film).
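As a rough sanity check on that 5 cents/shot figure, here is a minimal back-of-envelope sketch in Python; every dollar amount and frame count in it is an illustrative assumption, not a number anyone actually measured:

    # Amortize consumables and depreciation over the frames they cover
    # (all figures below are assumed, 2005-era ballpark values).
    def cost_per_frame(items):
        """items: list of (cost_in_dollars, frames_covered) pairs."""
        return sum(cost / frames for cost, frames in items)

    digital = [
        (100.0, 10000),    # memory cards (assumed)
        (50.0, 20000),     # rechargeable batteries and charger (assumed)
        (1500.0, 50000),   # camera body amortized over shutter life (assumed)
    ]
    film = [
        (5.0, 36),         # one 36-exposure roll of print film (assumed)
        (12.0, 36),        # developing and printing that roll (assumed)
    ]

    print("digital: ~$%.3f/frame" % cost_per_frame(digital))  # about $0.04
    print("film:    ~$%.3f/frame" % cost_per_frame(film))     # about $0.47, roughly 10x more

Change the assumptions and the exact number moves, but the order-of-magnitude gap between the two is hard to erase.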
This combination makes for a potent learning tool -- it's well established that quick, accurate feedback (immediate review and demanding exposure) and repetition (affordability) are essential to learning. Even though I shoot far more frames now than I did with film -- last night I was at a charity fundraiser my wife was running and I shot 441 exposures, probably about double what I'd have shot with film -- I have fewer blown shots. What's more, I get better results with each event I shoot, because I can make adjustments during the event and get immediate feedback.

Again, that doesn't mean I can skip the fundamentals. It's not mindless trial and error; in order to get even reasonable results I still need to understand what I'm doing and why. (I almost never use program autoexposure; when I'm using flash, I normally use manual exposure to balance the ambient light with the flash the way I want, but balancing flash against ambient light still means that I have to understand the effect I'm after and have a general notion of the ratios needed -- the arithmetic behind those ratios is sketched below.) But I can try different ratios in the vicinity of what I want and see very quickly exactly what happens.

The point of this isn't to glorify technology; it is simply a tool, and it has to be used appropriately to achieve a desired end. But I don't think it's appropriate to say "Galileo used books, so they're the right solution for children to learn from" either -- at that time there simply were no other options (other than word of mouth and personal experiment, which shouldn't be discounted).
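For what it's worth, the "ratios" above are just the flash contribution expressed in stops relative to the ambient exposure. A minimal sketch of that arithmetic in Python (the stop values are illustrative assumptions, not settings from any actual shot):

    # Flash-to-ambient balance: the ambient exposure is fixed by shutter speed,
    # aperture, and ISO; the flash is then set some number of stops under
    # (or over) that ambient level. Each stop is a factor of two in light.
    def flash_to_ambient_ratio(stops_under_ambient):
        """Return the flash:ambient light ratio as a single number (flash/ambient)."""
        return 2.0 ** (-stops_under_ambient)

    for stops in (0.0, 1.0, 1.5, 2.0):  # illustrative fill-flash settings (assumed)
        ratio = flash_to_ambient_ratio(stops)
        print("flash %.1f stops under ambient -> flash:ambient of 1:%.1f" % (stops, 1.0 / ratio))

Dial the flash further under the ambient exposure and it reads as subtle fill; bring it up toward even and it starts to dominate the look -- which is exactly the kind of adjustment that instant review lets you iterate on during an event.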