I also ran 72 tests' worth of UDP measurements, leveraging the Ixia-supplied UDP_Throughput.scr script as-is in the four-stream case. For the single-stream UDP tests, I modified UDP_Throughput.scr, increasing the per-stream data-payload size from 730 KBytes to 7.3 MBytes in order to lengthen the tests' runtimes. After all, as I previously mentioned, newer powerline technologies have focused their performance-improvement attention on UDP, which finds use in streaming large-payload multimedia content from one network client to another. You can find my UDP test-results reports in the earlier-mentioned ZIP archive.

But I've decided not to publish the average bandwidth measurements in a table, as I did one page back with the TCP results. That's because, in every testing case, only a few packets succeeded in making their way from one Endpoint to the other, translating to a near-100% packet loss scenario. To be clear, this underwhelming outcome did not occur due to any inherent HomePlug AV or IEEE 1901 UDP incompatibility; as I type these words, for example, I'm watching a video streamed from a computer to my television over an AR7400-implemented powerline spur and using UDP. But after a few moments' reflection, I came up with a seemingly plausible explanation, which a subsequent chat conversation with Brian Klug sanity-checked and an email back-and-forth with Ixia's Michael Githens confirmed.

The issue, as networking wizards among you out there may have already figured out, is that by definition, UDP is (in the words of Brian Klug) 'connectionless'. This means less protocol overhead, but with UDP being a best-effort approach, it's possible for an abundance of packets to get dropped due to transmission channel bandwidth limitations...that is, unless the transmitter is somehow (at the application layer) able to monitor the receiver's success-or-failure statistics and (if necessary) back off its packet-output rate. Windows Media Center, for example, does this; if necessary, it'll more aggressively compress video (for lower per-frame average bitrate, at the tradeoff of reduced image quality) and/or drop frames in order to match its transmission speed to the channel and receiver's capabilities.
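
To illustrate the general idea, here's a minimal sketch (mine, not Windows Media Center's actual algorithm) of application-layer rate backoff over UDP, in Python. The receiver periodically reports its received-datagram count back to the sender, which halves its offered rate when the inferred loss climbs too high and otherwise probes gently upward. All of the specifics (the PAYLOAD size, the REPORT_EVERY interval, the 5% loss threshold, the starting rate) are illustrative assumptions, not anything a real streaming application is bound to:

```python
import socket
import struct
import threading
import time

PAYLOAD = b"x" * 1024      # 1 KB datagrams (illustrative)
REPORT_EVERY = 100         # receiver reports back every N datagrams
ADDR = ("127.0.0.1", 50007)

def receiver():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(ADDR)
    received = 0
    while True:
        data, peer = sock.recvfrom(2048)
        if data == b"DONE":
            break
        received += 1
        if received % REPORT_EVERY == 0:
            # Application-layer feedback: how many datagrams actually arrived
            sock.sendto(struct.pack("!I", received), peer)

def sender(total=5000):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setblocking(False)
    rate = 8_000_000       # start at an 8 Mbit/s offered load (arbitrary)
    sent = 0
    while sent < total:
        try:
            sock.sendto(PAYLOAD, ADDR)
            sent += 1
        except BlockingIOError:
            pass           # local send buffer full: implicit backpressure
        try:               # non-blocking poll for a receiver report
            data, _ = sock.recvfrom(2048)
            received = struct.unpack("!I", data)[0]
            loss = 1 - received / sent
            if loss > 0.05:                # heavy loss: halve the offered rate
                rate = max(rate / 2, 100_000)
            else:                          # clean interval: probe gently upward
                rate *= 1.05
        except BlockingIOError:
            pass
        time.sleep(len(PAYLOAD) * 8 / rate)   # pace sends to the current rate
    sock.sendto(b"DONE", ADDR)             # tell the receiver we're finished

threading.Thread(target=receiver, daemon=True).start()
time.sleep(0.1)            # give the receiver a moment to bind
sender()
```

Run it as-is and the two ends converse over loopback, where there's essentially no loss to react to, so the rate simply climbs; across a congested channel, the rate-halving branch is what keeps the sender matched to what the receiver can actually absorb.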

IxChariot unfortunately doesn't seem to include destination-cognizance features such as these. All it seemingly knows is that, for example, the two HomePlug AV adapters I tested report 100 Mbps PHY rate capabilities (with the XAV5001 adapters touting GbE speeds), and it consequently sends out packets at much faster rates than the powerline spur can reliably deliver to their destination. Here's what Ixia's Michael Githens said when I explained my predicament and theory as to its root cause:

Unfortunately there isn't anything built into IxChariot to automatically help you figure out that no-drop rate. You could, however, script it to do a test, check results and change an input value to have it do multiple runs for you so that you can zero in on the no-drop rate. That would be the best I could suggest.

Such an iterative approach is actually something I'd considered, but its tedious and time-consuming aspects make it unattractive for the involved testing suite that I tackled in this project. And unfortunately, Brian Klug reports that iPerf seemingly takes a similarly destination-unaware, and consequently 'blind', approach to UDP testing. I welcome reader suggestions on utilities or other techniques that will measure the peak no-packet-drop UDP bitrate without excessively burdening the source-to-destination channel (thereby under-representing the channel's UDP potential). Mac OS X-compatible applications are particularly welcome, since the only two GbE-capable laptops in my possession are Macs, neither of which has Windows installed via Boot Camp. Please leave a comment on this writeup, or drop me an email.
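
In case it's useful to anyone attempting this, here's a sketch of the scripted approach Githens describes: binary-searching the offered UDP rate to zero in on the no-drop point. The run_udp_test() function below is a hypothetical stand-in whose simulated ~37 Mbps channel capacity is made up purely so the example runs; in a real harness it would kick off an IxChariot or iperf pass at the specified rate and parse out the reported loss fraction:

```python
def run_udp_test(rate_mbps: float) -> float:
    """Return the packet-loss fraction (0.0-1.0) at a fixed offered rate.

    Simulated stand-in: pretend the channel tops out at ~37 Mbps and that
    any excess offered load gets dropped. Replace this with a call to your
    actual measurement tool.
    """
    capacity = 37.0                       # made-up channel limit, demo only
    if rate_mbps <= capacity:
        return 0.0
    return 1.0 - capacity / rate_mbps

def find_no_drop_rate(low: float = 1.0, high: float = 100.0,
                      loss_limit: float = 0.01,
                      tolerance: float = 0.5) -> float:
    """Binary-search [low, high] Mbps for the fastest rate whose measured
    loss stays under loss_limit, converging to within `tolerance` Mbps."""
    best = low
    while high - low > tolerance:
        mid = (low + high) / 2.0
        if run_udp_test(mid) <= loss_limit:
            best, low = mid, mid          # mid passed; answer is at or above it
        else:
            high = mid                    # mid dropped packets; answer is below
    return best

print(f"Estimated no-drop rate: {find_no_drop_rate():.1f} Mbps")
```

Note that every search step costs a full test run, which is precisely the tedium that makes the approach unattractive at the scale of this project's test suite.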

For whatever it's worth, I'll wrap up this particular page by passing along the four router-to-node screenshots of 'raw' bandwidth potential that Netgear's Powerline Utility (Windows version; here's the Mac edition) reports for my AR7400-based XAV5001-plus-XAV5004 powerline network, from the router-based adapter to each of the four other network nodes.

Node 1 [Powerline Utility screenshot]

Node 2 [Powerline Utility screenshot]

Node 3 [Powerline Utility screenshot]

Node 4 [Powerline Utility screenshot]

Comments

  • quiksilvr - Thursday, September 1, 2011 - link

    Have you tried getting a better router and/or perhaps a better wireless card for your laptop?
  • akedia - Thursday, September 1, 2011 - link

    I have a current generation Airport Extreme, which is generally regarded as one of the best wireless routers available, and the built-in WiFi antenna in my Mac mini is not upgradable, as far as I know. My roommate's laptop is an HP dm1z, also not upgradable, and my Droid X is stuck with the antenna it shipped with as well. It's not my hardware, it's my environment. WiFi has limitations, like it or not.
  • bdipert - Thursday, September 1, 2011 - link

    Different tools for different tasks, jigglywiggly. Powerline can make a pretty good 'backbone' technology if, as I state in the article, you want to 'dispense with burrowing through dirty, spider- and snake-infested crawlspaces and drilling holes in walls and floors in order to route Cat5e cable around'. Wi-Fi conversely can be effective across intra-room and few-room spans...and with mobile devices.
  • Paedric - Thursday, September 1, 2011 - link

    Thanks for the article first, that's something I've been interested in for quite some time.

    However, I have a question: you tested in a "perfect" environment, disabling interfering devices to gauge the system's potential, but what happens when that's not the case?
    Is the performance hit really noticeable?

    I don't want to route a cable across the whole house, but I'm not really keen on turning off the fridge and lights and unplugging devices every time I want to connect to the internet.
  • Denithor - Thursday, September 1, 2011 - link

    I have the TRENDnet TPL-303E2K Powerline AV Adapter Kit installed in my home, connecting my wireless router in the living room to my office computer about 50 or 60 feet away. Couldn't get a solid enough wireless signal in the office for any kind of gaming, hooked up this kit and within literally 2 minutes was playing everything just fine.

    There's no need to unplug or turn off anything. It just works...
  • gariig - Thursday, September 1, 2011 - link

    I bought my parents the same TRENDnet kit that Denithor has (crazy coincidence) because their wireless router and extra computer are on opposite sides of a ~2000 sq ft house. It works flawlessly for normal computer usage (e-mail, YouTube, etc.) and printer sharing. I don't know how well it handles large file transfers, but I'd imagine you'll get at least 100 Mbps.
  • bdipert - Thursday, September 1, 2011 - link

    It depends. That's the only meaningful answer I can offer. That's why, after much gnashing of teeth and back-and-forth waffling, I decided to do my testing with everything turned off and disconnected. Otherwise, if (say) I had an especially noisy refrigerator motor, my results might have unfairly undershot some alternative typical-refrigerator reality. Obviously, my data wasn't the absolute best case...as I mentioned, I stuck with DHCP address assignments for the two Endpoints, instead of hard-wiring static IP addresses, and I concurrently ran all available powerline networking adapters although only three were in active use at any point in time, and I chose outlets out of functional meaningfulness to me, intentionally ignoring whether or not they spanned multiple breakers, or jumped across phases, in the process. But I also don't think it would have been right to turn on all potential interference sources, then do the tests.

    With that said, I regularly sling ~20 Mbps Windows Media Center streams (HD ATSC recordings) around my LAN, including through powerline spans, with no problem.
  • leexgx - Thursday, September 1, 2011 - link

    It just would have been nice if you had done a short test with everything switched on, to see how the adapters handled the interference (just one page of short tests), since you ran everything with it all off.

    You could have kept a laptop with you to monitor each powerline adapter's speeds as stuff came on. With the last powerline adapters I used, the stated speeds seemed close to the usable bandwidth (minus ~50% for overhead).

    I've found powerline adapters to be very reliable, including how they handle packet loss, most of the time (as of the last time I played with them).
  • Joe Martin - Thursday, September 1, 2011 - link

    Does it work for streaming video or not? Very hard-to-read article.
  • bdipert - Thursday, September 1, 2011 - link

    It's impossible for me to provide a simple answer to such a question without either undershooting or overshooting the spectrum of possible realities. First off, there's the bandwidth potential of any two powerline nodes in YOUR particular setup to consider...only you can measure and ascertain that. Then you've gotta determine what you mean by 'streaming video'...are we talking about a 20 Mbps encapsulated MPEG-2 (ATSC) HD stream coming from a Windows Media Center server, for example, or a heavily compressed sub-1 Mbps H.264 standard-definition video stream? Protocol? Etc...
