<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
Good point -- "<span style="white-space: pre-wrap">How would I know if an installation was meeting the specs?<span
style="white-space: normal">"</span></span>
<div class="moz-cite-prefix"><br>
It *has* been done before. From a historical perspective... <br>
<br>
When TCPV4 was being defined and documented in RFCs (e.g., RFC
793), circa 1981, other activities were happening in the
administrative bureaucracy of the US government, outside the realm
of the "research community".<br>
<br>
The US Department of Defense, which purchases huge quantities of
electronic equipment, declared TCP to be a "DoD Standard" in the
early 1980s. Further, they changed their purchasing rules so that
any equipment purchased which might need to communicate with other
equipment had to implement TCP. If you wanted to sell your
networked products to the government, those products had to implement TCP.
This caused industry to suddenly pay attention to what us crazy
researchers had done in creating this TCP thing.<br>
<br>
A separate piece of government, the US National Bureau of
Standards (now called NIST), defined a testing procedure for
verifying that a particular TCP implementation actually conformed
to the documented DoD Standard. Further, they also created a
program which would certify third-party labs as qualified to
perform those tests and issue conformance certificates. Such
conformance proof could be submitted by companies as part of their
sales process to supply equipment for DoD contracts.<br>
<br>
I remember this pretty well, since I set up one such TCP
Conformance Lab, got it certified, and we performed a lot of
testing and consulting to help traditional government contractors
figure out what TCP was all about and get their products certified
for DoD procurement. I've never learned who was orchestrating
those bureaucratic initiatives, but it seemed like a good idea.
There may have been other similar efforts in other countries over
the decades since 1981 that I don't know anything about.<br>
<br>
In the last 40+ years, AFAIK little else has happened for testing,
certification, or regulation of Internet technology. Hundreds,
perhaps thousands, of "standards" have been created by IETF and
others, defining new protocols, algorithms, and mechanisms for use
in the Internet. I'm not aware of any testing or certification
for any Internet technology today, or any way to tell if any
product or service I might buy actually has implemented,
correctly, any particular "Internet Standard".<br>
<br>
Governments can create such mechanisms around important
infrastructures, and have done so for transportation and many
others. IMHO they could do the same for the Internet, and seem to be
trying to do so. <br>
<br>
But to be effective, the administrators, politicians, and
regulators need to know more about how the Internet works. They
could create "Conformance Labs". They could involve
organizations such as the Underwriters Lab in the US, CSA in
Canada, CE (European Conformity) et al.<br>
<br>
If they knew they could and decided they should .... Education...<br>
<br>
Jack Haverty<br>
<br>
On 10/12/23 12:52, Hal Murray via Nnagain wrote:<br>
</div>
<blockquote type="cite"
cite="mid:20231012195244.33D5228C241@107-137-68-211.lightspeed.sntcca.sbcglobal.net">
<pre class="moz-quote-pre" wrap="">
Jack Haverty said:
</pre>
<blockquote type="cite">
<pre class="moz-quote-pre" wrap="">A few days ago I made some comments about the idea of "educating" the
lawyers, politicians, and other smart, but not necessarily technically
adept, decision makers.
</pre>
</blockquote>
<pre class="moz-quote-pre" wrap="">
That process might work.
Stanford has run programs on cyber security for congressional staffers.
From 2015:
Congressional Staffers Headed to Stanford for Cybersecurity Training
<a class="moz-txt-link-freetext" href="https://cisac.fsi.stanford.edu/news/congressional-staffers-headed-stanford-cybersecurity-training">https://cisac.fsi.stanford.edu/news/congressional-staffers-headed-stanford-cybersecurity-training</a>
</pre>
<blockquote type="cite">
<pre class="moz-quote-pre" wrap="">Today I saw a news story about a recent FCC action, to mandate "nutrition
labels" on Internet services offered by ISPs:
</pre>
</blockquote>
<pre class="moz-quote-pre" wrap="">
Is there a chicken-egg problem in this area?
Suppose I had a nutrition-label sort of spec for a retail ISP offering. How
would I know if an installation was meeting the specs? That seems to need a
way to collect data -- either stand alone programs or patches to existing
programs like web browsers.
Would it make sense to work on those programs now? How much could we learn if
volunteers ran those programs and contributed data to a public data base? How
many volunteers would we need to get off the ground?
Could servers collect useful data? Consider Zoom, YouTube, gmail, downloads
for software updates...
</pre>
</blockquote>
<br>
</body>
</html>