(Perhaps I should have chosen a less demanding game.)
I haven’t seen any other discussion regarding this, but if there has been any, my apologies!
Installation was straightforward. Here’s a guide:
Back when I had things running on my rpi4, I had it fairly overclocked. I’m guessing this is something we’ll need to look at doing on the DevTerm eventually, no doubt tweaking the kernel like we did with the Gameshell. I do seem to remember doing a lot more on the rpi, but I think that was prior to their own 64-bit OS, and I had to manually set up a 64-bit environment. That was years ago; the DevTerm is now.
I told myself I wouldn’t turn this into a game machine, and here I am. Whoops.
Were you running the DevTerm at stock settings? Because stock for the A06 downclocks the little cores to 1GHz and turns the big cores off altogether. You have to change the settings to get full performance - which should sit somewhere above a Raspberry Pi 4, right up until the thermal limit kicks in.
Hey, ah yes, I forgot to mention that I enabled the “gear” shift profiles. Haha, so many things I take for granted.
I haven’t done a glxgears readout yet to see what is being utilised. I have a feeling not everything is behaving as it should yet. For a start, having all 6 cores active would be nice.
Was this what you were referring to? If you’ve already done anything with this, please share what you know!
I just touched the tip of the iceberg regarding this, just to see if it works. I’m guessing some tweaks to the kernel will be needed to get the most out of this!
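To check what’s actually active before any kernel tweaking, something along these lines should show it (standard Linux sysfs paths and Mesa tools assumed - package names may vary by distro):

```shell
# Which CPU cores are currently online? "0-5" means all six.
cat /sys/devices/system/cpu/online

# Which OpenGL renderer is in use? A Mali/Panfrost string means the GPU
# driver is working; "llvmpipe" means you're falling back to software
# rendering. (glxinfo comes from the mesa-utils package on Debian-based
# systems.)
glxinfo | grep "OpenGL renderer"
```

That second check is worth doing before reading too much into any glxgears numbers - if you’re on llvmpipe, the GPU isn’t being used at all.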
And yup, core-wise the CPU is practically the same between a rpi4 and the DevTerm A06. It’s the GPU that’s better than the VideoCore one from the RPi. (I might have gotten the names mixed up - just going from memory)
Anyway. I am keen to explore more. Just waiting till the weekend!
Yes, although you can also adjust things manually. In my testing, running the DevTerm in “gear six” - stock clocks, all six cores enabled - results in something that comfortably outperforms a Raspberry Pi 4 in bursty workloads:
The Raspberry Pi 4 has a Broadcom VideoCore VI GPU; the RK3399 in the DevTerm has an Arm Mali T860-MP4. All things being equal, the RK3399 should - and in other devices I’ve tested, does - comfortably beat the Raspberry Pi 4 for both CPU and GPU performance. Sadly, those thermal limits mean the DevTerm can’t quite manage it, at least for sustained workloads.
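For reference, the manual equivalent of “gear six” is just CPU hot-plugging and frequency caps through sysfs. A sketch, assuming the usual RK3399 layout (cpu0-3 are the little A53 cores, cpu4-5 the big A72 cores) - needs root:

```shell
# Bring the two big A72 cores online (core numbering assumed for RK3399).
echo 1 | sudo tee /sys/devices/system/cpu/cpu4/online
echo 1 | sudo tee /sys/devices/system/cpu/cpu5/online

# Lift the frequency cap on every cpufreq policy back to its hardware
# maximum, i.e. stock clocks.
for p in /sys/devices/system/cpu/cpufreq/policy*; do
    cat "$p/cpuinfo_max_freq" | sudo tee "$p/scaling_max_freq"
done
```

The gear profiles presumably do something close to this under the hood, plus fan control.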
It would be interesting to see what could be done with a replacement rear case which offers room for a considerably bigger heatsink and fan…
I don’t think it’s working as an insulator, but it’s definitely not up to the job of cooling it at full-tilt - at least, not with that fan.
One thing that may help, though I haven’t yet tried it, is building a shroud between the blower on the extension board and the heatsink - even something as simple as a little cardboard tunnel - so that less of the air leaves the case by the vents before it even reaches the heatsink.
I might try to make something with pla-plate tomorrow for a shroud to direct the air. But ultimately, a modified or alternative rear housing will be needed in the long run, especially for people wanting to run a CM4 compute module with a heatsink and fan.
A Compute Module 4 runs quite a bit cooler than the A-06 RK3399 module. A passive heatsink is probably enough for that, but you’d still need a modified case if you’re trying to use a CM4 in a SODIMM adapter on the existing carrier board - there’s not enough clearance for the adapter, the CM4, and a heatsink.
Precisely. So having a rear housing with a larger cavity will serve both as a means to incorporate improved cooling and to accommodate a CM4 compute module.
The Gameshell had an alternate Lego back cover, so I don’t see an alternate, larger rear cover being too far out of left field. Then again, they never made an official light-key-integrated one. That was a community member.
Definitely something to look into, using the existing 3D files of the chassis.
Quick little update, showing videos of it running.
The first video is from the pre-recorded opening. I got excited, seeing it run at 34FPS, with pretty high emulation accuracy.
The second video is the actual speed it runs. Ugh. 15FPS. Missing textures. Did I mention it’s running VERY SLOW?
The only thing I changed was a shift in the DevTerm’s gear. I’ll need to find out if it’s being thermally throttled. A bit hard right now, since Australia is going through a heat wave. Hello, 40-degrees-Celsius days!
I’m going to do a read around to see if anyone’s done an overclock, just before I turn my DevTerm into an expensive paperweight/doorstop. It’s already uncomfortable to hold, and the fans aren’t really doing much.
The first number it prints out is the temperature in millidegrees Celsius - so 36875 is about 37°C. The rest of the numbers are the clock speeds of each of the six cores in the DevTerm. If you’re not running in gear six, some will say “<unknown>” - those are the cores that are switched off.
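That readout can be reproduced with something along these lines (sysfs paths assumed for the RK3399 - this is a sketch, not the exact script from above):

```shell
#!/bin/sh
# Print the SoC temperature. The kernel reports it in millidegrees C,
# so 36875 means roughly 37 degrees C.
awk '{printf "%.0f C\n", $1/1000}' /sys/class/thermal/thermal_zone0/temp

# Print each core's current clock. Offline cores have no cpufreq
# directory, so report those as "<unknown>".
for cpu in /sys/devices/system/cpu/cpu[0-5]; do
    if [ -r "$cpu/cpufreq/scaling_cur_freq" ]; then
        echo "$cpu: $(cat "$cpu/cpufreq/scaling_cur_freq") kHz"
    else
        echo "$cpu: <unknown>"
    fi
done
```

Run it in a loop (e.g. under `watch`) while the emulator is going and you can see the throttling happen live.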
When you load the emulator, you’ll see the clock speeds jump to their maximum - then the temperature will start to climb. Once it hits 80°C, it’ll throttle - and you can watch the clock speeds sink.
Here’s a thermal torture test I did - representing an absolutely worst-case scenario. I mean, real terrible stuff - you’ll never get it as hot as this as quickly as this under a real-world workload.
You can see it starts throttling at around three seconds into the ten-minute workload(!)
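If you want to run a similar worst-case load yourself, a sketch (assumes `stress-ng` is installed and the usual thermal sysfs path - ten minutes of full load, logging the temperature once a second):

```shell
#!/bin/sh
# Load every core for ten minutes; "--cpu 0" means one worker per core.
stress-ng --cpu 0 --timeout 600s &

# While the stressor is still running, log the SoC temperature
# (millidegrees C) once per second.
while kill -0 $! 2>/dev/null; do
    cat /sys/class/thermal/thermal_zone0/temp
    sleep 1
done
```

Graph the output and you should see the same shape: a near-vertical climb to the throttle point, then clocks sinking to hold the temperature there.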
This is, of course, why the DevTerm arrives configured to turn off the two big cores and lock the little cores at 1GHz:
Much better - but, obviously, you’re sacrificing performance.
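For completeness, the shipped configuration is roughly this in sysfs terms (core layout assumed as before: cpu0-3 little, cpu4-5 big; 1008000 kHz is a typical RK3399 operating point near 1GHz - needs root):

```shell
# Turn the two big A72 cores off entirely.
echo 0 | sudo tee /sys/devices/system/cpu/cpu4/online
echo 0 | sudo tee /sys/devices/system/cpu/cpu5/online

# Cap the little-core cluster at ~1GHz (value in kHz; must match one of
# the frequencies listed in scaling_available_frequencies).
echo 1008000 | sudo tee /sys/devices/system/cpu/cpufreq/policy0/scaling_max_freq
```

Swapping between this and the “gear six” settings is all the gearbox really amounts to, so there’s plenty of room to define intermediate profiles of your own.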