On Wednesday 25. November 2015 21.16.42 Paul Sokolovsky wrote:
> Paul Boddie <paul@boddie.org.uk> wrote:
>> Again, I haven't kept up with this, so it's useful to know. I remember that the i386 architecture had support for larger address spaces, but I guess that it was more convenient to move towards the amd64 variant in the end.
> The way it was pushed on everyone, yeah. And we'll see if the same is happening now with arm64 - just like you, I skipped the x86_64 "revolution", so I can't judge for sure, but as far as I can tell, yeah, it's being pushed pretty hard. Which is only sad for projects like EOMA68, because it has been such an endless run, and all the careful selection of nice 32-bit SoCs risks going down /dev/null, since consumers will soon greet the classic stuff with "wwwwhat? it's not 64-bit?"
Well, I used Alpha back in the 1990s and that was a 64-bit transition worth pursuing, given how much faster the architecture was than virtually everything else that was available to me at the time. But it's precisely the "it's not 64-bit?" thinking that leads to products like that Olimex one (and also the Qualcomm arm64 development board [*]) where, as I noted, the most significant current motivation - more memory headroom for those who need it - is completely ignored. I guess we'd both ask what the point of such products is, other than "preparing for the future" or whatever the usual sales pitch is.
[*] https://developer.qualcomm.com/hardware/dragonboard-410c
("Free graphics possible" according to the Debian Wiki's ARM 64 port page...
https://wiki.debian.org/Arm64Port
...which lists the gold-plated options for "preparing for the future" offered by vendors who don't really seem to be certain that it is the future at the moment.)
>> FUD? Ouch! Thanks for classifying some pretty innocent remarks in such a negative way.
> Perhaps it was a bit strong, but we all know that the EOMA68 project is rather overdue, and it feels that maybe - just maybe - something will finally materialize soon. And maybe - just maybe - coming up with a high-end 2GB module is a good idea to show that the project can deliver a "bleeding edge" spec and not perceivably lag behind the market midline.
At this point 2GB is pretty reasonable. Indeed, I'd be happy with 2GB for my own purposes: I did consider upgrading my desktop machine to 2GB a few months ago, but it would be much nicer to go with EOMA68 devices instead. My personal justification for more than 1GB involves developing and testing stuff in User Mode Linux, which isn't happy running with a small tmpfs, and although I could switch to other virtualisation technologies (distribution support just wasn't there when I started to use UML), I imagine that I'd still need more memory to make it all happy, anyway.
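(To illustrate where the memory goes, here is a rough sketch of the kind of thing I mean. The kernel and filesystem image names are just placeholders, not my actual setup, but the mem= option and the TMPDIR-backed guest memory are standard UML behaviour, which is why a small tmpfs causes trouble:)

#!/usr/bin/env python3
# Rough sketch only: start a UML guest whose "physical" memory is backed
# by a file created on a host tmpfs, so that tmpfs must be big enough.
# The kernel and root filesystem paths below are made-up placeholders.

import os
import subprocess

UML_KERNEL = "./linux"       # a UML kernel binary built with ARCH=um
ROOT_FS = "root_fs.img"      # a guest root filesystem image

def run_uml(mem_mb=768, tmpdir="/dev/shm"):
    # UML creates its guest memory file under TMPDIR, so the tmpfs
    # mounted there needs at least mem_mb MiB free.
    env = dict(os.environ, TMPDIR=tmpdir)
    subprocess.run(
        [UML_KERNEL, "mem=%dM" % mem_mb, "ubd0=%s" % ROOT_FS, "rw"],
        env=env,
        check=True,
    )

if __name__ == "__main__":
    run_uml()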
I could imagine using a couple of 1GB devices instead for such activities. Going down to 512MB could work, too, but might involve more attention to the distribution-level stuff. At some point, the extra work required in tailoring the software to work well with the hardware is work I don't really have time for, however. So, 256MB would present some awkward choices, given the state of various distributions today.
I'm not saying that everyone needs 2GB or even 1GB (or maybe even 512MB), but then again, I'm not a big media "consumer" on my hardware, so I do wonder what amount of memory would be the minimum to satisfy "most people".
> But marketing it with "RAM is (or will soon be) the inhibiting factor around the adoption of single-board computers." will, IMHO, only hurt the project, as its main goal (unless my memory plays tricks on me) is to deliver commodity hardware into the hands of people to do real-world things (and to allow that hardware to be sustainably reused for doing even more real-world things).
Oh, it wasn't a marketing suggestion at all. I don't suggest playing the game of who can offer the most RAM, but it is clear that the amount of RAM does influence potential buyers and their impressions of what they might be able to use the device for. One can certainly get away with offering something very cheap and saying that it is good enough for "lightweight tasks" or, given the shenanigans in the "educational" scene at the moment, as an almost "throwaway" piece of gear that occupies the attention of certain kinds of users for a short while, especially with things like this Raspberry Pi Zero that was announced very recently. But I think that it serves people better to have something that can address more than their absolute basic needs. Otherwise, they'll say that "this cheap thing is all very well, but I want a proper laptop", or whatever.
> So, one scenario for how it all may turn out is that all this sustainability talk is just a marketing gimmick, and there won't be much more sustainability in EOMA than there is freedom in some Fairphone. It will be a toy for a particular kind of hipster, delivered to them in recycled denim bags. Luke gets rich and will drive a sustainable, personally tuned Tesla, but not rich enough to produce an open-hardware SoC. All that would be a pretty sad finale after all these years.
Actually, Fairphone has come some way in terms of freedom: the Qualcomm SoC that's in the second product may even be supportable by Free Software. Meanwhile, Luke's Tesla would have to be personalised to stand out where I live. :-)
> So, I hope the project will continue to educate people about how cool it is to run a home router with server capabilities on 256MB of RAM instead of the typical 32MB, even if it costs 3x as much (for starters; one hopes it gets more sustainable over time), rather than ship luxuries and proclaim the demise of 1GB single-board computers.
Well, a lot of these computers have to get to 1GB first before their demise. ;-) But it is certainly the case that you can deliver better (and better-supported) hardware to various kinds of devices, and for many of them a modest amount of memory (by desktop/laptop standards) would go a long way. I don't think I would ever dispute that.
Paul