On Wed, Jan 10, 2018 at 4:11 PM, Jonathan Neuschäfer j.neuschaefer@gmx.net wrote:
On Wed, Jan 10, 2018 at 01:59:19PM +0000, Luke Kenneth Casson Leighton wrote:
so they would be perfectly and absolutely within their rights to take the Card out, plug it into a.... testing station of some kind (even if that's just a breakout board), plug in a MicroSD card with "UART debug enabled" and off they go.
Although I don't like this solution (what if the on-card firmware wants to print debug logs?), I have to acknowledge that it is a solution. (I would prefer that early debug output is available without (even temporarily) rendering a CPU card non-compliant.)
another solution is: the designer of the Card ensures that early debug messages may be accessed through one of the peripherals, via the user-facing connectors of the Card. in the case of the EOMA68-A20 Card, that is possible with the insertion of a "MicroSD Breakout Board", whereupon both JTAG and UART0 are accessible if you boot with the correct pinmux settings.
this, again, does NOT require that the UART wires of EOMA68 be placed into a state that results in total confusion over their use and purpose... more on this below.
no it's even more basic / simpler than that. they don't have to do that, they can just put the Card into a break-out PCMCIA holder socket. or anything else.
... and enable early debug mode through unspecified means, right?
no, it's very clearly specified that there *is* no limit or restriction. which is totally different from leaving things "unspecified".
In short: Thank you for the clarification. Now I disagree with this decision in the spec. :-/
ok, so bear in mind that the UART wires double up as GPIO, and that it is the HOUSING DESIGNER's right, under the EOMA68 specification, to make the decision to allocate either one (or both) of the UART wires to GPIO - as either Input or Output.
This is AFAICS not a big problem under my suggested change.
For reference, this is my suggested change:
- CPU Cards may use the UART lines for debug purposes while they are not fully (enough) booted.
ok so that implies that the UART lines MAY be UART... they MIGHT also be GPIO. please bear in mind: anything that involves "confusion" is AUTOMATICALLY prohibited from being included in the EOMA68 Standard. i appreciate that in this case you describe a procedure that would remove doubt, but the procedure itself is very burdensome to implement.
there is a story i told about the X25 Standard which illustrates how these kinds of choices result in lost opportunities and/or total confusion and thus destroy confidence in a standard.
- When a CPU card has fully (enough) booted, it must use the UART pins in the function that's described in the EEPROM.
ok so the boot process you propose is:
* bring up the CPU (DDR3, PLLs)
* initialise EEPROM GPIO pins and configure them as I2C
* read an I2C EEPROM
* decode it
* work out if it's SAFE to write to UART
* THEN write debug / print messages on the UART pins
... can you see how that's not "early" at all?
For example, UART connected to a Bluetooth module, GPIO connected to whatever, etc.
- If a housing needs to protect its components from debug traffic, it must provide (and describe in the EEPROM) a mechanism for the CPU card to signal that it has booted far enough to use the UART pins for the function intended by the housing. This can be done through a I2C register poke, toggling a (different) GPIO line, etc.[2]
this is _way_ too complicated, and also not clear.
- I think it should be valid for a CPU card to follow the current model and keep the UART pins tri-stated until it's fully booted. A housing that wants to capture early debug traffic can generate a well-defined idle signal on the TX line with a pull-up.
this is even more complicated... and also unnecessary when the person doing the debugging may either:
* in-situ, use multiplexing of user-facing connectors (A20 MicroSD / UART-JTAG capability)
* take the Card out of the Housing and test it in a home-made or laboratory-owned test Housing.
This is a debug facility. Not all CPU cards have to use it, but all housings must accept it.
that places a huge technical burden and complexity on Housing Designers *and* Card Designers, where no such complexity or burdensome requirements exist at the moment.
Thus it is just as (non-)optional as USB, with the difference that the CPU card decides whether it prints early debug messages, and the Housing decides whether it connects the USB pins to any USB devices or connectors.
the purpose of requiring the "non-optionality" is to ensure that there is absolutely no way that a future Card or future Housing will be incompatible with an older Card or an older Housing, no matter how much faster the peripherals on either the Card(s) or the Housing(s) have become.
SD/MMC itself is a perfect example, as not only is the speed auto-negotiated based on how many physical wires "happen to connect" - 1 2 or 4 - but there is also *automatic* host-to-card negotiation of speed capabilities *built into the protocol*. likewise SATA and USB, and also the ADSL / SDSL broadband protocol, and also PCIe. all of these protocols do "negotiation", right down to the VERY slowest, oldest possible equipment that could possibly be plugged in.
all of these protocols are incredibly simple as far as EOMA68 is concerned: they "take care of themselves". UART, SPI, I2C, RGB/TTL and the three degenerate cases GPIO, PWM and EINT on the other hand, cannot "self-negotiate". that's what the I2C EEPROM is for: to describe and specify those functions totally unambiguously so that the Card is GUARANTEED - as long as it follows the EOMA68 specification - NOT to do any damage to itself or to the Housing.
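as a purely illustrative sketch of what "describe and specify those functions totally unambiguously" could look like, here is a hypothetical per-pin descriptor. the field names, enum values and layout are all my own invention for this example and are NOT the actual EOMA68 EEPROM format:

```c
#include <stdint.h>

/* hypothetical pin-function codes; the real EOMA68 EEPROM format differs */
enum pin_function {
    PIN_UNUSED = 0,
    PIN_UART,
    PIN_SPI,
    PIN_I2C,
    PIN_RGBTTL,
    PIN_GPIO_IN,   /* the three degenerate cases: */
    PIN_GPIO_OUT,  /*   plain input / output ...  */
    PIN_PWM,
    PIN_EINT,
};

/* one entry per multiplexed pin, stored in the Housing's I2C EEPROM */
struct pin_descriptor {
    uint8_t pin_number;
    uint8_t function;   /* enum pin_function */
};

/* the Card refuses to drive a pin unless the Housing declared it UART:
 * with an unambiguous declaration there is no possible damage scenario */
static int card_may_drive_uart_tx(const struct pin_descriptor *d)
{
    return d->function == PIN_UART;
}
```

the point of the example: because the descriptor admits exactly one function per pin, the "is it safe?" check is a single comparison, with no negotiation and no ambiguity.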
the other consideration i have is that the standard has to be simple, and implementation of Housings and Cards has to be very straightforward. what you are proposing has two possible alternatives (actually a third, as you suggested: open up the Card's case and start re-wiring things and hand-soldering on extra components or connections) where all of the alternatives achieve exactly the same thing... *without* requiring that the EOMA68 Standard have additional complexity added to it.
"Fully (enough) booted" in the above doesn't just mean the CPU has left the bootloader. It also has to have read the I2C EEPROM, which might require quite a bit of work in the kernel (initializing the I2C controller, at least). Things can go wrong before the CPU card has booted far enough to read and interpret the I2C EEPROM, which is my whole motivation.
exactly. and that's precisely why additional complexity should *not* be added to the negotiation phase.
.... so what do you think would happen, in this case, if someone plugged in a Card where it was FORCIBLY REQUIRED that UART *ABSOLUTELY MUST* transmit "early boot messages" on those two wires?
Required by which part?
sorry, the use of the word "part" is not clear. part of the standard? part of the Card? part of the Housing? part of the proposed modification to the standard?
anyway, that was written when i believed that you were proposing that Cards are forced to transmit early boot messages over UART.
- Housings shouldn't require to see any debug messages from CPU cards, that's just silly
the point is: if the wires are to be forced to transmit early debug messages, then in ABSOLUTELY NO WAY can they also be allowed to change over to GPIO. there must be ABSOLUTELY NO POSSIBLE RISK that their use as UART could conceivably cause damage to either the CPU or to the Housing components.
and if there is a current fight - a GPIO that is tied permanently to VREFTTL on the Housing, versus a forced requirement to transmit UART early debug messages that tries to drive that same wire - we have a serious problem.
i appreciate that you have come up with a solution to this, involving a complex process of ascertaining via the EEPROM whether the pins are GPIO or UART, but it is complexity where *none* exists at the moment, and there are two easy alternatives that place absolutely no burden whatsoever on the Technical EndUser.
i trust that you can appreciate that it would overwhelm both me and you and everyone on this list to have three ongoing *simultaneous* separate and distinct highly-technical logical reasoning discussions, yeh?
Yes.
cool, whew :) this _is_ really important, to make absolutely sure that the standard will be useful and useable for at least the next decade.
thanks jonathan.
l.