From: rjmcmahon <rjmcmahon@rjmcmahon.com>
To: Network Neutrality is back! Let´s make the technical aspects heard this time! <nnagain@lists.bufferbloat.net>
Date: Fri, 13 Oct 2023 13:50:17 -0700
Subject: Re: [NNagain] Internet Education for Non-technorati?

As an open-source maintainer of iperf 2, which is basically a network
socket & traffic tool, I find this history extremely interesting.
Releasing a measurement tool free to all, with transparent code, gives
everyone access to a "shared yardstick" -- a rough sketch of the idea
follows below. While maybe not enough, hopefully it helps a little
after those 40+ years of not much.
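To make the "yardstick" idea concrete, here is a toy sketch -- not
iperf itself, just the core measurement it automates: a TCP sink on
one host, a sender on another, and goodput computed from bytes moved
over wall-clock time. Port 5001 is borrowed from iperf 2's
traditional default; everything else is a placeholder you would
adjust.

#!/usr/bin/env python3
# Toy illustration of the measurement a tool like iperf 2 automates:
# run with no arguments to act as the TCP sink, or with a hostname to
# blast bytes at a sink and report goodput. Not iperf itself.
import socket
import sys
import time

PORT = 5001          # iperf 2's traditional default port
BLOCK = 128 * 1024   # 128 KB write/read size
DURATION = 10        # seconds the sender transmits

def server():
    # Sink side: accept one connection, count bytes until the peer
    # closes, then report goodput over the elapsed wall-clock time.
    with socket.create_server(("", PORT)) as srv:
        conn, addr = srv.accept()
        total, start = 0, time.monotonic()
        with conn:
            while True:
                data = conn.recv(BLOCK)
                if not data:
                    break
                total += len(data)
        elapsed = time.monotonic() - start
        print(f"received {total} bytes in {elapsed:.2f}s "
              f"= {total * 8 / elapsed / 1e6:.1f} Mbit/s from {addr[0]}")

def client(host):
    # Sender side: write zero-filled blocks as fast as the path
    # allows for DURATION seconds, then report what was sent.
    payload = b"\x00" * BLOCK
    with socket.create_connection((host, PORT)) as sock:
        total, start = 0, time.monotonic()
        while time.monotonic() - start < DURATION:
            sock.sendall(payload)
            total += len(payload)
    elapsed = time.monotonic() - start
    print(f"sent {total} bytes in {elapsed:.2f}s "
          f"= {total * 8 / elapsed / 1e6:.1f} Mbit/s")

if __name__ == "__main__":
    if len(sys.argv) > 1:
        client(sys.argv[1])   # python3 yardstick.py <sink-host>
    else:
        server()              # python3 yardstick.py

iperf 2 adds the parts that matter in practice (UDP, parallel
streams, interval reports, latency), but the principle is the same:
identical, inspectable code on both ends of the wire, so everyone is
measuring with the same stick.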
Bob

> Good point -- "How would I know if an installation was meeting the
> specs?"
>
> It *has* been done before. From a historical perspective...
>
> When TCPV4 was being defined and documented in RFCs (e.g., RFC 793),
> circa 1981, other activities were happening in the administrative
> bureaucracy of the US government, outside the realm of the "research
> community".
>
> The US Department of Defense, which purchases huge quantities of
> electronic equipment, declared TCP to be a "DoD Standard" in the
> early 1980s. Further, they changed their purchasing rules so that
> any purchased equipment which might need to communicate with other
> equipment had to implement TCP. If you wanted to sell your networked
> products to the government, they had to implement TCP. This caused
> industry to suddenly pay attention to what us crazy researchers had
> done in creating this TCP thing.
>
> A separate piece of government, the US National Bureau of Standards
> (now called NIST), defined a testing procedure for verifying that a
> particular TCP implementation actually conformed to the documented
> DoD Standard. Further, they also created a program which would
> certify third-party labs as qualified to perform those tests and
> issue conformance certificates. Such conformance proof could be
> submitted by companies as part of their sales process to supply
> equipment for DoD contracts.
>
> I remember this pretty well, since I set up one such TCP Conformance
> Lab, got it certified, and we performed a lot of testing and
> consulting to help traditional government contractors figure out
> what TCP was all about and get their products certified for DoD
> procurement. I've never learned who was orchestrating those
> bureaucratic initiatives, but it seemed like a good idea. There may
> have been other similar efforts in other countries over the decades
> since 1981 that I don't know anything about.
>
> In the last 40+ years, AFAIK little else has happened for testing,
> certification, or regulation of Internet technology. Hundreds,
> perhaps thousands, of "standards" have been created by the IETF and
> others, defining new protocols, algorithms, and mechanisms for use
> in the Internet. I'm not aware of any testing or certification for
> any Internet technology today, or any way to tell if any product or
> service I might buy has actually implemented, correctly, any
> particular "Internet Standard".
>
> Governments can create such mechanisms around important
> infrastructures, and have done so for transportation and many other
> sectors. IMHO they could do the same for the Internet, and seem to
> be trying to do so.
>
> But to be effective the administrators, politicians, and regulators
> need to know more about how the Internet works. They could create
> "Conformance Labs". They could involve organizations such as
> Underwriters Laboratories in the US, CSA in Canada, CE (European
> Conformity) bodies in Europe, et al.
>
> If they knew they could and decided they should .... Education...
>
> Jack Haverty
>
> On 10/12/23 12:52, Hal Murray via Nnagain wrote:
>
>> Jack Haverty said:
>>
>>> A few days ago I made some comments about the idea of "educating"
>>> the lawyers, politicians, and other smart, but not necessarily
>>> technically adept, decision makers.
>>
>> That process might work.
>>
>> Stanford has run programs on cybersecurity for congressional
>> staffers.
>>
>> From 2015:
>> Congressional Staffers Headed to Stanford for Cybersecurity Training
>> https://cisac.fsi.stanford.edu/news/congressional-staffers-headed-stanford-cybersecurity-training
>>
>>> Today I saw a news story about a recent FCC action, to mandate
>>> "nutrition labels" on Internet services offered by ISPs:
>>
>> Is there a chicken-and-egg problem in this area?
>>
>> Suppose I had a nutrition-label sort of spec for a retail ISP
>> offering. How would I know if an installation was meeting the
>> specs? That seems to need a way to collect data -- either
>> stand-alone programs or patches to existing programs like web
>> browsers.
>>
>> Would it make sense to work on those programs now? How much could
>> we learn if volunteers ran those programs and contributed data to a
>> public database? How many volunteers would we need to get off the
>> ground?
>>
>> Could servers collect useful data? Consider Zoom, YouTube, gmail,
>> downloads for software updates...
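P.S. On Hal's question about working on those programs now: the
volunteer client could be very small. A sketch, where both URLs are
hypothetical placeholders (a well-known test object, and a public
collection endpoint that doesn't exist yet):

#!/usr/bin/env python3
# Sketch of the volunteer measurement client Hal describes: time a
# fetch of a known test object, then contribute the sample to a
# public database. Both URLs below are hypothetical placeholders.
import json
import time
import urllib.request

TEST_OBJECT = "https://example.org/100MB.bin"     # hypothetical test file
COLLECTOR = "https://example.org/api/v1/samples"  # hypothetical public DB

def measure(url):
    # Download the test object, recording time-to-first-response as a
    # crude latency proxy and total elapsed time for throughput.
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        first_resp = time.monotonic() - start
        nbytes = len(resp.read())
    elapsed = time.monotonic() - start
    return {
        "url": url,
        "bytes": nbytes,
        "seconds": round(elapsed, 3),
        "ttfb_seconds": round(first_resp, 3),
        "mbit_per_sec": round(nbytes * 8 / elapsed / 1e6, 2),
    }

def contribute(sample):
    # POST the sample as JSON to the (hypothetical) public collector.
    req = urllib.request.Request(
        COLLECTOR,
        data=json.dumps(sample).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    sample = measure(TEST_OBJECT)
    print(sample)
    print("collector responded:", contribute(sample))

Getting off the ground is then mostly a matter of hosting the
collector, agreeing on the sample schema, and recruiting volunteers
-- the code really is the easy part.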