College students have long played an integral role in the development and adoption of new technology. Students, along with businesspeople, comprised the bulk of the portable electric typewriter market in the 1960s and 1970s. In the mid-1970s, two students—Bill Gates and Steve Ballmer—met while living in the same hall at Harvard, and went on to play critical roles in the development of the personal computer in the 1980s and 1990s. Universities were among the first institutions to support the growth of the internet, and for a time provided high-speed internet access to more people than did corporations. In the late 1990s, a Northeastern University student named Shawn Fanning and his uncle developed Napster, one of the first popular peer-to-peer file sharing programs. Again at Harvard, Mark Zuckerberg and fellow computer science majors developed Facebook, which was initially only available to college students, but now is the second most-trafficked website, after Google. Google itself was born through the collaboration of two Stanford University graduate students, Larry Page and Sergey Brin.

Icons of file sharing, social media, and internet search: all hatched on college campuses

Today’s college students are universally expected to be computer-literate. Every college campus in America has computing centers with anywhere from a handful to hundreds of networked systems available for student use. Most campuses provide extensive wireless internet access to students. Technophile professors like my own graduate adviser at the University of Wisconsin-Madison, John Hawks, often communicate with students via blogs, Twitter, and even Facebook. Many assignments must be submitted electronically, and professors increasingly incorporate novel forms of coursework and evaluation, like videos uploaded to YouTube and wikis produced by students. In short, it is nearly impossible for today’s college student to succeed without making extensive use of computing technology. And of course, the millions of Americans who take online distance learning courses are entirely dependent upon access to a personal computer and the internet.

What kind of technology does a college student need to buy?

To be blunt, the answer is not much. Most colleges and universities provide more than enough access to technology that some students never buy a personal computer, let alone a printer, scanner, or other gadgets. I wouldn’t recommend going this route: relying on shared computer labs is inconvenient and ties you to their hours, school-provided hardware is sometimes aggravatingly outdated, and campus networks do not always work. That said, college is already incredibly expensive, and you can’t cut your technology budget below zero dollars. If you do attempt to get your degree without your own PC, become thoroughly familiar with your school’s technology resources first. For most students, though, spending some money on personal technology makes life much, much easier.

College is not just about learning Latin declensions, radioisotope decay chains, and great works of fiction. It’s also about learning how to live more or less independently. Our lives are steeped in technology, and college students are no different from anyone else with a job: there is no single correct technology solution. The most basic computing solution for a college student entails one personal computer, be it a desktop or a laptop.

A desktop or a laptop?

In the context of college, desktops and laptops both have their advantages and disadvantages. Desktops are almost always more powerful for the money, are easier to upgrade and repair as your needs change, and are harder to steal or lose. On the other hand, they take up more space and aren’t portable. A laptop’s most notable advantage is portability: you can take it anywhere to get work done. Laptops also occupy less volume, a major consideration for cramped dorm rooms. But they’re a prime target for theft on campuses, and cost more for equivalent specifications.

Since the rise of netbooks and the ever-decreasing cost of desktops, I’ve come to think that asking whether to use a desktop or a laptop is the wrong question. Netbooks are frequently less than $300, with some as inexpensive as $200 (or even less on sale or clearance). A basic desktop can be built or bought for $500 or less, monitor included. Rather than deciding between a laptop and a desktop, it’s wiser to ask yourself what your computing needs are. Most college students need to be able to browse the web and use office applications to type papers and make presentations. These tasks do not require the latest and greatest (and therefore most expensive) tech. If you do not need more than basic computing capabilities, I’ve found that pairing an inexpensive netbook or budget laptop with a basic office desktop is a far better solution than relying on a single powerful laptop or potent desktop.

Another important consideration is how long you expect your computer(s) to last. It is perfectly reasonable to expect today’s budget gear to be able to passably browse the web and type papers for the next four years. It is not reasonable to expect today’s budget gear to be able to play 2015’s games and run Adobe Creative Suite 6 or 7 very well. It is difficult to predict what you’ll need for the next four years, but speaking with older students in your program and your professors can give you a good idea of what you’ll be doing as a senior. For those looking to buy a new PC—laptop or desktop—the next few pages cover DIY and off-the-shelf (retail) desktop computers and monitors as well as netbooks and laptops.

DIY Desktops


Comments

  • shellcrash - Saturday, August 6, 2011 - link

    There is also a common denominator issue going on: not everyone is expected to have a laptop. If one is required, the prof borrows one from the IT department.

    I graduated in 2006 & laptops were affordable, but they have durability issues that show up with frequent transport and use. In random work meetings, having a desk & environment designed for work is much better than trying to do serious work in a laid-back coffee shop environment; I stopped using my laptop when I started using my Palm (or was it lol TI-89?) & flash drive effectively. It also didn't help that the fans on the laptop died and some keys stopped working on the keyboard.

    The reviews didn't cover phones, although a phone is the 2nd most crucial instrument in college. It needs to be able to back up to a computer and handle fast assignment scheduling.
  • anishannayya - Monday, August 8, 2011 - link

    75% (really closer to 50%) of my classmates didn't have to spend a dime for their computers.

    50% of my classmates aren't CS/Engineering majors.

    The 20% of CS/Engineering majors that have a Mac inevitably end up having to install and use Windows.

    The only Mac worth the money is the Air. For everything else, you're better off with a ThinkPad.

    Lastly, Macs look great, but durability is very important when you take your device everywhere.

    And RE desktops: Many gamers bring their desktop.
  • Neo Elemental - Monday, August 15, 2011 - link

    I had a desktop all the years I was in college. To some, having additional horsepower is a non-negotiable (think games). I did end up caving and getting a netbook in my senior year.

    I don't see how the desktop+netbook combo is any less feasible or attractive for someone at an equivalent price (except that not everyone can build their own desktop).

    This isn't the site for those who are just going to get a Macbook because they can afford it. A majority of the things mentioned/reviewed on this site are focused towards desktops and non-Apple systems.
  • mfenn - Thursday, August 4, 2011 - link

    You make a big deal about constrained space when you talk about the PC itself, but you seem to completely forget about it when you recommend dual monitors. A tower can easily be tucked under a bed or desk; a second 22" monitor cannot.
  • Kaboose - Friday, August 5, 2011 - link

    Wall mount :D

  • Friendly0Fire - Thursday, August 4, 2011 - link

    "And given the limitations of the 400W PSU, I wouldn't add a GPU that's more power-hungry than a Radeon HD 6970."

    Surely you meant a 6790? The 6970 isn't exactly sipping power.
  • Gigantopithecus - Thursday, August 4, 2011 - link

    Nope, the 6970 is not exactly a power sipper, but at peak during gaming it draws less than 200W. The rest of the system at stock, without more components added, isn't going to draw more than another 120W or so, and that leaves 20%+ headroom on the PSU. The link to Bench I gave with that line shows the 6970 system using 340W at the wall, and the Bench test platform is a more power-hungry system than the one in this guide. A friend of mine is rockin' a GTX 560 Ti (a slightly less power-hungry card than the 6970) with a mildly OC'd AMD PII X4 (a 125W CPU vs. the 95W CPU in the guide), two fans, and one HDD on this same PSU with no issues. It's a good PSU.
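    The headroom arithmetic in the comment above can be sketched as a quick back-of-the-envelope calculation. The wattage figures below are the rough estimates quoted in this thread (about 200W peak for the 6970 and about 120W for the rest of the system at stock); treat them as illustrative assumptions, not measurements.

    ```python
    # Back-of-the-envelope PSU headroom check, using the rough figures
    # quoted in the comment thread above (assumptions, not measurements).

    def psu_headroom(psu_rating_w, component_draws_w):
        """Return (total DC draw in watts, headroom as a fraction of the PSU rating)."""
        total = sum(component_draws_w.values())
        headroom = (psu_rating_w - total) / psu_rating_w
        return total, headroom

    draws = {
        "gpu_peak": 200,       # Radeon HD 6970 peak gaming draw (per the comment)
        "rest_of_system": 120, # CPU, board, RAM, drive, fans at stock (per the comment)
    }

    total, headroom = psu_headroom(400, draws)
    print(f"Total draw: {total}W, headroom: {headroom:.0%}")
    # 320W on a 400W unit leaves 20% headroom, matching the comment's estimate
    ```

    Note this is a DC-side budget: the 340W Bench figure cited above is measured at the wall, so PSU conversion losses mean the components themselves draw less than that.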
  • JarredWalton - Friday, August 5, 2011 - link

    Well, I've run a single 5870 off of a 500W PSU, and power draw at the outlet never got above 380W. Accounting for efficiency, a 400W PSU should still handle a 6970, but you wouldn't want to load it up with other extras or extreme overclocking.
  • Gigantopithecus - Friday, August 5, 2011 - link

    After thinking about this and on the advice of Ryan, our GPU guru, I edited the article to recommend nothing more power-hungry than a 6870. That's a more comfortably conservative recommendation, and I think it's better to err on the side of caution.
  • mariush - Saturday, August 6, 2011 - link

    That's indeed better.

    A lot of pre-built computers still come with very cheap power supplies based on old designs, with lots of amps on the 3.3V and 5V rails and not so much on 12V, so those 220 watts could be a bit too much for those power supplies.

    Someone might get mixed up and think that any kind of 400w power supply would be capable of this, which is of course not true.
