Trying out the Ryzen 5 3400G Part 1: Warframe Orokin Derelict and Plains of Eidolon.

Updated: Sep 4, 2019

By the way, I am not recording frame rate for academic comparison purposes in this video series. I'm just playing my favourite games on this processor; in this post, it's Warframe. Later I may do an actual benchmark test with graphs and all that.

So I got a Ryzen 5 3400G "APU" (I don't think AMD even call it that anymore), or more accurately a "Processor with Graphics". This part, called "Picasso", is essentially the "Raven Ridge" silicon from the Ryzen 5 2400G ported to GlobalFoundries' newer 12nm LP process, resulting in better efficiency and higher clock speeds, both of which are very nice to have.

In addition, AMD tightened up the cache timings (L1 through L3) and improved the memory controller. The CPU cores conform to what AMD classifies as "Zen+" (2nd Generation Ryzen), whereas the original 14nm Raven Ridge was closer to 1st Generation Ryzen in latency, albeit slightly improved.

Anyway, the CPU clocks are much higher, at 3.7 GHz base and 4.2 GHz boost, and I am observing 4.05-4.1 GHz on all cores in multi-core workloads, and 3.9-3.95 GHz on all cores under heavy stress (100% load). Essentially, the light-load max-turbo clock of its predecessor (the 2400G) is now the average all-core boost clock of the 3400G. That's a pretty nice step of 200-300 MHz, and you get slight IPC improvements from the tighter cache and IMC latency on top of that. Oh look, I typed a ton of stuff.

Anyway, the video. I tested the new Processor with Graphics in my favourite game, which is of course Warframe. The important system specs are as follows:

  • Ryzen 5 3400G @ stock w/ Wraith Spire (Copper) with Wraith Stealth fan attached

  • MSI B350M Mortar (VRM heatsink removed, don't ask, it's fine for 65W)

  • 2x8GB 2400 MHz CL14-16-16 DDR4 (no single-channel gimping here)

  • Integrated Radeon Vega 11 Graphics @ stock (1400 MHz max)

Yes, the RAM is slow, please don't get mad at me for that. It's all I have on hand for this PC, and I'm not pulling the 3200 MHz kit out of my desktop, either. Consider it a more accurate test of what someone would actually pair with this relatively inexpensive processor; they're not likely to want to spend more on RAM than on the processor itself, right? Anyway, I may test much faster RAM later, and I also plan to do a bit of overclocking on this kit. That will come a bit later.

Okay so without further ado, here is my first video in my testing.

Note: For those who won't read the video description, the On-Screen Display does this weird derping. I've actually noticed it on my desktop too, periodically; it just seems to happen more on the OSD when MSI Afterburner is recording. An RTSS bug, I guess.

Secondly, I am using the software encoder (CPU) for recording the footage, as I don't want to jump through the hoops required to set up the ReLive software to use the chip's video engine for encoding. (I will wait for the official driver.) It's of little consequence anyway, as the recording compression is set to use just one thread and has almost no impact on performance.

Lastly, the video is displayed at 1280x720 because YouTube doesn't support 1440x900 native videos. The game is running at 1440x900; only the video on YouTube is 720p.


Performance is good. Well, you can watch the video to see it for yourself. Sorry about the poor quality, YouTube loves butchering my videos into a mess of compression artifacts. >_>

I made the On-Screen Display bigger so it's easier to see the statistics. If you "TLDW" (that's "Too Long, Didn't Watch"), it can be summarised like this: in the interior scenes of the Orokin Derelict, the processor and its onboard graphics easily managed over 60 FPS at 1440x900 at near-max settings, essentially all of the time. In the most demanding area of the game as of this writing, the Plains of Eidolon, the frame rate is in the 40-50 FPS range at maximum detail settings. The exception is GPU particles, which are on "High", because the game won't let me select "Ludicrous" on this setup for some reason.

I'm probably going to test at 1080p later, too, since I think that is sort of the "Holy Grail" of integrated graphics: gaming at 1080p with decent settings and frame rates.

©2020 by Sashleycat