We covered the LG Optimus 3D during its launch event yesterday, and the device has continued to draw our attention. Today we decided to track down an LG Optimus 3D to get a better impression of its 3D capture and playback capabilities, and to run some benchmarks. 

The Optimus 3D features (surprise) a 3D autostereoscopic display from LG. Both the IPS panel underneath and the parallax barrier on top are LG's own. Most of the Android UI is of course 2D - the parallax barrier can be switched on and off, or varied in intensity, depending on the context. 

We got some time to check out the 3D capable YouTube application. It's not an extra or an add-on, but rather Google's own official YouTube application - it just happens to have 3D support included. Right now that YouTube version is exclusive to the LG Optimus 3D, but it'll eventually move into the Android Market and onto other phones as they too gain 3D capture and playback support. YouTube puts a 3D icon next to videos that support 3D and plays them back on the Optimus 3D in landscape with the parallax barrier enabled. 

Our impressions of the parallax barrier display are relatively positive. Depth isn't overwhelming at most settings, and can be adjusted in most 3D contexts by dragging a slider. When viewing the display, it's obvious that there's a set of optimal viewing angles for the parallax barrier. The 3D effect is difficult to describe - as you change your viewing angle across the display, the effect comes and goes accordingly. It definitely takes some thought to position yourself appropriately. 

LG also worked with Gameloft on a number of titles, including N.O.V.A. 3D. We got to play around with it and came away decently impressed. LG spent a lot of time deciding which scenes should have negative or positive parallax, and includes a depth slider for intensity. Other 3D games also include depth sliders and even allow 3D to be disabled entirely. There are a bunch of other uses for 3D that LG is working on, including closer collaboration with Google on other applications.

3D video capture is supported in the native camera application. The interface is exactly the same as the one on the LG Optimus 2X, except there's an additional 3D button. Press that, the parallax barrier fires up, and you get a live 3D preview. 

Other than that, we've had a busy couple of days of benchmarking at Mobile World Congress. Yesterday we spent a lot of time with Samsung's recently announced Galaxy S II, featuring Samsung's Exynos SoC. The Exynos is Samsung's own dual-core Cortex A9 with an ARM Mali 400MP GPU. 

We measured competitive CPU performance from the Exynos in the Galaxy S II; GPU performance, however, was a bit behind Tegra 2. 

The big news is that we spent some time using and benchmarking LG's Optimus 3D. The Optimus 3D is the first smartphone to use TI's OMAP 4430 SoC. Like NVIDIA's Tegra 2, the 4430 integrates a pair of ARM Cortex A9 cores behind a shared 1MB L2 cache. The OMAP 4430 differs from Tegra 2 in three key ways:

1) TI integrates ARM's Media Processing Engine (MPE) enabling NEON support

2) TI has a dual-channel LPDDR2 memory controller (Tegra 2 only has a single-channel memory controller).

3) The OMAP 4430 integrates Imagination Technologies' PowerVR SGX 540 running at ~300MHz, while NVIDIA uses its own GeForce GPU in Tegra 2.

I don't expect there to be a measurable performance advantage today due to MPE as I'm not aware of any substantial (or any?) NEON code in the apps we test with. The dual-channel memory controller and higher clocked SGX 540 are another matter entirely.

Most PowerVR SGX 540 designs run the GPU core at up to 200MHz. OMAP 4's implementation is another 50% faster. LG's software build also uses a newer version of Imagination Technologies' driver (1.6 instead of 1.5) which fixes some rendering issues (specifically in the Egypt test) and improves performance considerably (likely between 10 - 30% in GLBenchmark2). You can see that the image quality issues are fixed in the video showing the Egypt demo running on the LG Optimus 3D below:

[Benchmark charts: SunSpider Javascript Benchmark 0.9, Rightware BrowserMark, GLBenchmark 2.0 - Egypt, GLBenchmark 2.0 - PRO]

Taking the performance improvement due to the driver out of the equation, we're still looking at a nearly 20% increase in performance over the SGX 540 in Hummingbird. The 4430 holds a similar advantage over NVIDIA's Tegra 2.

It's sort of nice to have so many different SoCs available at the same time: TI's OMAP 4430, Samsung's Exynos 4210, NVIDIA's Tegra 2 AP20H, and Qualcomm's upcoming MSM8660. Right now it looks like the OMAP 4430 may be the best performer of them all, however it remains to be seen what Qualcomm's Adreno 220 will bring to the table when the MSM8660 ships later this year. 

29 Comments

  • MobiusStrip - Tuesday, February 15, 2011 - link

    This camera will not produce proper 3-D (meaning 3-D as humans see it).

    It can't, because it doesn't have the points of view that normally spaced human eyes do. The lenses' field of view doesn't make any difference; the disparity between the images isn't enough to create a realistic 3-D effect.
  • name99 - Tuesday, February 15, 2011 - link

    "I don't expect there to be a measurable performance advantage today due to MPE as I'm not aware of any substantial (or any?) NEON code in the apps we test with."

    I know this comment was made about Android, but I think it is worth pointing out that Apple does make use of NEON code in iOS. At the most obvious level, there are Apple libraries (part of the iOS API) for various numerical code, like FFTs and linear algebra, that utilize NEON.

    While much of the numerical processing you'd expect in video/audio playback and core animation is presumably done in ASICs or on the GPU, I suspect that the voice recognition stuff, for example, (perhaps also some HDR stuff?) is done on NEON.
    (Or perhaps not yet --- damn that HDR stuff is slow, and maybe with iOS 5 we'll get a nice speed boost as it moves onto either NEON or the GPU?)
  • metafor - Tuesday, February 15, 2011 - link

    A lot of Android's UI library (prior to Honeycomb) is, I believe, done in NEON. Other than that, I'm not really aware of much. Then again, other than rendering websites, I don't know of many CPU-intensive tasks on phones at all.

    As phone software becomes more demanding, there may be more uses for NEON in code that's not easily offloaded to the GPU/DSP.
  • Vlad T. - Tuesday, February 15, 2011 - link

    Any chance to run Flash benchmark and see how it performs?
    http://images.anandtech.com/graphs/graph4177/35414...

    Thank you,
    Vlad.
  • Brian Klug - Wednesday, February 16, 2011 - link

    Sadly no, the Optimus 3D I spent a lot of time with didn't have Flash installed. We could've installed it from the Market, but it's possible that Adobe and TI are working on something together - it's hard to tell whether the Market-based Flash would've been representative.

    -Brian
  • jeffrey - Tuesday, February 15, 2011 - link

    Does the LG Optimus 3D actually use 3 separate CMOS sensors? I see what appears to be two on the back of the phone, and is there a front-facing camera also? Does anyone know if this is physically two separate sensors or some type of new trick with 2 lenses and one sensor?

    I wonder what the specs of each camera are: two 5MP sensors in back and one VGA in front?
  • kenyee - Tuesday, February 15, 2011 - link

    Surprised they didn't use NVidia's Tegra 3D chip instead but this definitely ups the bar for the Tegra 3D :-)
  • Amon - Tuesday, February 15, 2011 - link

    Hi,

    I found these tests very interesting, but I'm curious as to whether the testing software is able to fully utilize and test both CPU cores and, in the case of the Samsung Galaxy S II, all four GPU cores.

    Should we expect to see the dual-core CPUs and the quad-core GPU do better with newer versions of the testing software?

    Also, do you have any comment on the difference between the Optimus 2X and the Atrix in most tests? I had expected the Atrix to beat the Optimus 2X.

    Thanks.
  • TareX - Wednesday, February 16, 2011 - link

    You need to consider the fact the Atrix has a higher resolution screen, and is running an older version of Android :)
