[Tfug] Cheap Memory = Lardy Men = UofA Comp Sci Program

Bexley Hall bexley401 at yahoo.com
Sat Dec 22 15:05:01 MST 2007


--- "Bowie J. Poag" <bpoag at comcast.net> wrote:

> I would agree... When I attended U of A about 7-8
> years ago, their Comp 
> Sci program was kind of lame. I felt at the time
> that it didn't go very 
> far toward preparing students for real-world coding.

I think this is true of many universities.  Some
"teach for today" (i.e. skills that can be used
*immediately* upon graduation -- that are already 2
years obsolete  :< ) and others "teach for tomorrow"
(skills that tell you HOW TO LEARN and then rely on
*you* to learn what you must for that first employer).

I think it is not just a CS phenomenon.  I think the
same is true of many disciplines.  (I've heard this
from past employers.)

> In retrospect, I 
> was right.. They didn't prepare us for real-world
> coding, nor did they 
> emphasize good coding habits, or even begin to teach
> topics such as 
> what's good coding philosophy, what open source is,
> everything that 
> everyone knew was going to be commonplace in the

When I was in school, "open source" hadn't been
"invented" yet (I think Stallman and I went to school
together but I can't recall exact dates).  For that
matter, there was no "commercial software" besides
"business software" running on Blue Iron and 11's,
etc.

But, they exposed us to lots of different concepts.
(My favorite course was "Introduction to Algorithms"
which was a fantastic understatement -- as it touched
on many clever algorithms for which it took years for
me to find suitable applications).  It highlighted
cost analysis of various approaches so you were
focused on that aspect of some "abstract" problem
instead of the problem itself.
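A contrived little example of that cost-accounting mindset
(mine, not from the course): instrument two search strategies
with a comparison counter and *measure* what each one spends,
instead of just checking that it "runs".

```c
#include <assert.h>
#include <stddef.h>

/* Count how many probes each approach "spends" finding a key
   in a sorted array -- the cost, separate from the answer.   */

static int linear_search(const int *a, size_t n, int key, int *cost)
{
    *cost = 0;
    for (size_t i = 0; i < n; i++) {
        (*cost)++;                      /* one probe per element   */
        if (a[i] == key)
            return (int)i;
    }
    return -1;
}

static int binary_search(const int *a, size_t n, int key, int *cost)
{
    size_t lo = 0, hi = n;
    *cost = 0;
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;
        (*cost)++;                      /* one probe per halving   */
        if (a[mid] == key)
            return (int)mid;
        if (a[mid] < key)
            lo = mid + 1;
        else
            hi = mid;
    }
    return -1;
}
```

Same "abstract" problem, same answer -- but on 1000 sorted
elements the worst case is ~1000 probes one way and ~10 the
other.  That's the aspect the course made you stare at.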

It (my education) was also good at exposing us to
broad-brush concepts and lofty goals that have
validated themselves over the years.  E.g., I
recall a language design class where one of the
stated goals of any "good" computer language was
to be able to write a program on a single sheet
of paper (try doing that with Java!).  At the time,
it sounded incredibly arbitrary.  But, as someone
who has had to maintain pieces of poorly written
code that filled *notebooks*, I now appreciate the
value of that goal!

> future. They touched on 
> none of it. It was more like "Oh, it runs? Great! A+
> for you. Next?"  I 
> would have preferred it be something more along the
> lines of "Oh, it 
> works? Great. You get a D+. Now go make your RPN
> calculator applet 
> lightweight, so it doesn't require 39MB to run on
> this machine. If you 
> can do it in a memory footprint of less than (x) and
> still keep it 
> readable, you get an A."

[Java]

> Simple procedural programming is still the dominant
> method of getting 

Agreed.  Though I use many object-oriented concepts
in how I *design* my applications.  I just don't
buy into the OO *languages* and their mechanisms.
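For what it's worth, a sketch of the sort of thing I mean --
encapsulation and dispatch done in plain C with a struct of
function pointers (the names here are mine, purely illustrative):

```c
#include <assert.h>

/* "OO design" in a procedural language: a device interface as
   a struct of function pointers.  Each driver fills in its own
   implementations; generic code only ever sees the interface. */

struct device_ops {
    int  (*read)(void);      /* fetch one value from the device */
    void (*write)(int v);    /* push one value to the device    */
};

/* One concrete "class": a loopback device backed by a cell. */
static int loopback_cell;
static int  loopback_read(void)    { return loopback_cell; }
static void loopback_write(int v)  { loopback_cell = v; }

static const struct device_ops loopback = {
    .read  = loopback_read,
    .write = loopback_write,
};

/* Generic code written against the *interface*, not the driver. */
static int echo(const struct device_ops *dev, int v)
{
    dev->write(v);
    return dev->read();
}
```

You get the polymorphism and the information hiding; you just
don't pay for a language runtime to get them.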

> stuff done, not OO.. For most things, OO is exactly
> what it appears to be. Unnecessary, time-consuming,
> slow, bloaty overkill. My time as a student would
> have been better spent learning how to be a better 
> procedural coder, and THEN open the door to OO
> concepts later. I mean, what good is it learn how
> to carry around an OO-centric 300 pound Swiss 
> Army Knife with 117 different attachments (of which
> you'll only use maybe 2), when all you really need
> is common sense? 

Better example:  how good is that Swiss Army Knife
when you are being attacked by a *BEAR*??!  :>
Sometimes, a crude, blunt *rock* is more effective!

> Ditch the knife and 
> use your head, for cryin' out loud. Don't teach me
> WHAT to think--Teach 
> me HOW to think. How do I make my code fast AND
> readable? How do I 
> collaborate with others?  How can I be more
> strategic in my approach to 
> problem solving? What does it mean to write truly
> portable, streamlined, 
> orthogonal code? How do I maintain my code over it's
> lifespan? I got none of that at UofA, sadly.

I think you will find that lifespans of commercial
codebases are very short, nowadays.  It seems like
lots of code just "serves its purpose" and then
"disappears" -- only to be reinvented some time
later (with all the same bugs, etc.)

> Ugh... and to make matters still worse, UofA
> relegated assembly language 
> to the CE track, and away from CS, which in my book,

"CE"?  (CS == Computer Science)

> is criminal. Sad, 
> and criminal. Worse, they tried to line up SML/NJ in
> it's place, like it 
> taught the same concepts.. good lord..

In my case, "CS" was one of three "EE" degrees
(i.e., my diploma says "EE").  All three EE curricula
had a core set of coursework.  So, if you opted for
the "pure" EE degree, you still learned how to write
a compiler and how to implement multiple precision
floating point, etc.  Likewise, the "CS" option
still taught you how to design a computer and how
to bias an amplifier, etc.

I think this is probably a key part of my approach
to designs.  I see things bottom-up, for the most
part.  I.e., I know what my code is running on
(usually because I *designed* the hardware!) and
how much everything costs (because I know there
are only X bytes of ROM on the board and Y bytes
of RAM!) because I *must* know.

I think it also explains why I have never had any
"problem" dealing with "pointers" (since I've had
to wire the "address lines" to the memory and
implicitly think of everything as existing *at*
a memory address) or "concurrency" (since you
design hardware to do several things in parallel,
why should software be any different?).
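The way I picture it (toy sketch, my names): memory is a flat
array of cells, and a pointer is just the number you put on the
address lines.  The old trainer primitives make that explicit:

```c
#include <stddef.h>

/* A pointer is just an address.  peek/poke -- the classic
   "trainer" primitives -- read and write a cell given only
   the number that would appear on the address lines.        */

static unsigned char ram[16];               /* pretend address space */

unsigned char peek(size_t addr)             /* bus read  */
{
    return *(ram + addr);                   /* identical to ram[addr] */
}

void poke(size_t addr, unsigned char value) /* bus write */
{
    *(ram + addr) = value;
}
```

Once you've wired address lines, `*(base + offset)` stops being
mysterious -- it's just which cell the decoder selects.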

Maybe it's time to bring back the "trainers" (dumb
little microcontroller boards with pushbuttons,
lights and a hex keypad so you could key in your
code and watch it execute).

<shrug>

--don

