List of Archived Posts

2005 Newsgroup Postings (06/22 - 07/05)

Book on computer architecture for beginners
The Worth of Verisign's Brand
The Worth of Verisign's Brand
The Worth of Verisign's Brand
SHARE 50 years
Book on computer architecture for beginners
SHARE 50 years?
Signing and bundling data using certificates
derive key from password
Jack Kilby dead
who invented CONFIG/SYS?
The Worth of Verisign's Brand
The Worth of Verisign's Brand
The Worth of Verisign's Brand
The Worth of Verisign's Brand
The Worth of Verisign's Brand
Newsgroups (Was Another OS/390 to z/OS 1.4 migration
The Worth of Verisign's Brand
The Worth of Verisign's Brand
Bank of America - On Line Banking *NOT* Secure?
Newsgroups (Was Another OS/390 to z/OS 1.4 migration
The Worth of Verisign's Brand
The Worth of Verisign's Brand
The Worth of Verisign's Brand
The Worth of Verisign's Brand
PKI Crypto and VSAM RLS
ESCON to FICON conversion
How does this make you feel?
IBM/Watson autobiography--thoughts on?
Importing CA certificate to smartcard
A good argument for XML
More Phishing scams, still no SSL being used
More Phishing scams, still no SSL being used
More Phishing scams, still no SSL being used
More Phishing scams, still no SSL being used
More Phishing scams, still no SSL being used
More Phishing scams, still no SSL being used
More Phishing scams, still no SSL being used
IBM/Watson autobiography--thoughts on?
Safe to transmit (symmetric) key encrypted with itself?
IBM/Watson autobiography--thoughts on?
25% Pageds utilization on 3390-09?
More Phishing scams, still no SSL being used

Book on computer architecture for beginners

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: Book on computer architecture for beginners
Newsgroups: comp.arch,alt.folklore.computers
Date: 22 Jun 2005 11:45:27 -0600
keith writes:
ALDs, that Del refers to, were printed on 1403s using a special "sideways" chain and were printed across the crease, so two sheets made one ALD page. When the 3800 came out, P'ok had to install a special RPQ to allow over-the-crease printing, before the 1403s could be replaced.

printing sideways ... the document's vertical lines would be printed horizontally along a print line (which always connected). you then set the printer to 8 lines/inch (instead of 6), making the space between lines much smaller, and use a slightly taller line character ... which would print the document's horizontal lines down the (now vertical) page.

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: The Worth of Verisign's Brand
Newsgroups: netscape.public.mozilla.crypto
Date: 22 Jun 2005 14:09:00 -0600
"Anders Rundgren" writes:
Lynn,

Some TTP CAs actually *do* require RP contracts.

The "only" problem with that is that this is usually also connected to RP authentication to OCSP services for payment purposes.

So even if the certs are stale the information is dynamically verified.

Unfortunately this kind of PKI has scalability issues and their promoters (banks) are only losing money. Therefore VeriSign's model in spite of its contractual weaknesses still seems to reign.

Maybe you are exaggerating the need to sue CAs for huge sums?

That DNSSEC could kill the SSL CA is probably true but it seems that DNSSEC suffers from a dysfunctional business model. The SSL PKI has a working business model.


we were asked to work with this small client/server startup in silicon valley that wanted to do payment transactions on their server ... and they had this stuff called SSL

https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3

in the year we worked with them ... they moved and changed their name. trivia question ... who had owned the rights to their new name?

as part of the effort on doing the thing called a payment gateway and allowing servers to do payment transactions ... we also had to perform business due diligence on most of these operations that produced these things called SSL domain name certificates. At the time we coined the term certificate manufacturing ... to differentiate what most of them were doing from this thing called PKI (aka most of them didn't actually have any operational business process for doing much more than pushing certificates out the door).

It was also when we coined the term merchant comfort certificates (since it made the relying parties ... aka the consumers ... feel better).

we also originated the comparison between PKI CRLs and the paper booklet invalidation model used by the payment card industry in the 60s. when some number of people would comment that it was time to move payment card transactions into the modern world using digital certificates ... we would point out to them that rather than modernizing the activity ... it was regressing the operations by 20-30 years.

Another analogy for certificates is the offline payment card world of the 50s & 60s ... which had to mail out invalid account booklets on a monthly basis ... and then as the number of merchants, card holders and risks increased ... they started going to humongous weekly mailings. At least they had a record of all the relying parties (aka merchants) ... while the typical PKI operation has no idea whatsoever who its relying parties are.

It was sometime after we started pointing out that PKIs really had a business model oriented at an offline business environment ... which would result in regressing many business operations by decades if it was force fit on them ... that you saw OCSP come on the scene.

OCSP doesn't actually validate any information ... it is just there to validate whether any information that might be in a certificate is actually still valid. The CRL model is known to not scale ... as found out in the payment card industry going into the 70s.

PKIs imply the administration and management of the trust information. In the offline certificate model ... either they have to have a list of all possible relying parties and regularly push invalidation lists out to them ... or they provide an online service which allows relying parties to check whether a certificate is still valid. However, as you point out, PKI administration and management of any kind doesn't really scale ... which resulted in actual deployments being simple certificate manufacturing (instead of real PKI).

The payment card industry also demonstrated the lack of scaling of the certificate-like offline model in the 70s when it converted to an online model for the actual information.

Part of the problem with the online OCSP model is that it has all the overhead of an online model with all the downside of the offline implementation ... aka not providing the relying party access to real online, timely, actual information ... things like timely aggregation information (current account balance) or timely sequences of events (say, for fraud detection).

part of the viability for no/low-value market segment is to stick with simple certificate manufacturing and don't actually try to manage and administrate the associated information trust.

random past certificate manufacturing posts
https://www.garlic.com/~lynn/aepay2.htm#fed Federal CP model and financial transactions
https://www.garlic.com/~lynn/aepay2.htm#cadis disaster recovery cross-posting
https://www.garlic.com/~lynn/aepay3.htm#votec (my) long winded observations regarding X9.59 & XML, encryption and certificates
https://www.garlic.com/~lynn/aadsm2.htm#scale Scale (and the SRV record)
https://www.garlic.com/~lynn/aadsm2.htm#inetpki A PKI for the Internet (was RE: Scale (and the SRV
https://www.garlic.com/~lynn/aadsm3.htm#kiss7 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm5.htm#pkimort2 problem with the death of X.509 PKI
https://www.garlic.com/~lynn/aadsm5.htm#faith faith-based security and kinds of trust
https://www.garlic.com/~lynn/aadsm8.htm#softpki6 Software for PKI
https://www.garlic.com/~lynn/aadsm8.htm#softpki10 Software for PKI
https://www.garlic.com/~lynn/aadsm8.htm#softpki14 DNSSEC (RE: Software for PKI)
https://www.garlic.com/~lynn/aadsm8.htm#softpki20 DNSSEC (RE: Software for PKI)
https://www.garlic.com/~lynn/aadsm9.htm#cfppki5 CFP: PKI research workshop
https://www.garlic.com/~lynn/aadsmore.htm#client4 Client-side revocation checking capability
https://www.garlic.com/~lynn/aepay10.htm#81 SSL certs & baby steps
https://www.garlic.com/~lynn/aepay10.htm#82 SSL certs & baby steps (addenda)
https://www.garlic.com/~lynn/aadsm11.htm#34 ALARMED ... Only Mostly Dead ... RIP PKI
https://www.garlic.com/~lynn/aadsm11.htm#39 ALARMED ... Only Mostly Dead ... RIP PKI .. addenda
https://www.garlic.com/~lynn/aadsm13.htm#35 How effective is open source crypto? (bad form)
https://www.garlic.com/~lynn/aadsm13.htm#37 How effective is open source crypto?
https://www.garlic.com/~lynn/aadsm14.htm#19 Payments as an answer to spam (addenda)
https://www.garlic.com/~lynn/aadsm14.htm#37 Keyservers and Spam
https://www.garlic.com/~lynn/aadsm15.htm#0 invoicing with PKI
https://www.garlic.com/~lynn/aadsm19.htm#13 What happened with the session fixation bug?
https://www.garlic.com/~lynn/98.html#0 Account Authority Digital Signature model
https://www.garlic.com/~lynn/2000.html#40 "Trusted" CA - Oxymoron?
https://www.garlic.com/~lynn/2001d.html#7 Invalid certificate on 'security' site.
https://www.garlic.com/~lynn/2001d.html#16 Verisign and Microsoft - oops
https://www.garlic.com/~lynn/2001d.html#20 What is PKI?
https://www.garlic.com/~lynn/2001g.html#2 Root certificates
https://www.garlic.com/~lynn/2001g.html#68 PKI/Digital signature doesn't work
https://www.garlic.com/~lynn/2001h.html#0 PKI/Digital signature doesn't work
https://www.garlic.com/~lynn/2001j.html#8 PKI (Public Key Infrastructure)
https://www.garlic.com/~lynn/2003.html#41 InfiniBand Group Sharply, Evenly Divided
https://www.garlic.com/~lynn/2003l.html#36 Proposal for a new PKI model (At least I hope it's new)
https://www.garlic.com/~lynn/2003l.html#45 Proposal for a new PKI model (At least I hope it's new)
https://www.garlic.com/~lynn/2003l.html#46 Proposal for a new PKI model (At least I hope it's new)
https://www.garlic.com/~lynn/2004m.html#12 How can I act as a Certificate Authority (CA) with openssl ??

random past comfort certificate postings:
https://www.garlic.com/~lynn/aadsm2.htm#mcomfort Human Nature
https://www.garlic.com/~lynn/aadsm2.htm#mcomf3 Human Nature
https://www.garlic.com/~lynn/aadsm2.htm#useire2 U.S. & Ireland use digital signature
https://www.garlic.com/~lynn/aadsm3.htm#kiss5 Common misconceptions, was Re: KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss7 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsmail.htm#comfort AADS & X9.59 performance and algorithm key sizes
https://www.garlic.com/~lynn/aepay4.htm#comcert Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert2 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert3 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert4 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert5 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert6 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert7 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert8 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert9 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert10 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert11 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert12 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert13 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert14 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert15 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert16 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert17 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay6.htm#dspki use of digital signatures and PKI
https://www.garlic.com/~lynn/aepay10.htm#80 Invisible Ink, E-signatures slow to broadly catch on (addenda)
https://www.garlic.com/~lynn/2000c.html#32 Request for review of "secure" storage scheme
https://www.garlic.com/~lynn/2001c.html#62 SSL weaknesses
https://www.garlic.com/~lynn/2003l.html#43 Proposal for a new PKI model (At least I hope it's new)
https://www.garlic.com/~lynn/2004b.html#39 SSL certificates
https://www.garlic.com/~lynn/2004c.html#43 why and how VeriSign, thawte became a trusted CA?
https://www.garlic.com/~lynn/2004i.html#4 New Method for Authenticated Public Key Exchange without Digital Certificates

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: The Worth of Verisign's Brand
Newsgroups: netscape.public.mozilla.crypto
Date: 22 Jun 2005 14:57:15 -0600
Anne & Lynn Wheeler writes:
we were asked to work with this small client/server startup in silicon valley that wanted to do payment transactions on their server ... and they had this stuff called SSL
https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3

in the year we worked with them ... they moved and changed their name. trivia question ... who had owned the rights to their new name?

as part of the effort on doing the thing called a payment gateway and allowing servers to do payment transactions ... we also had to perform business due diligence on most of these operations that produced these things called SSL domain name certificates. At the time we coined the term certificate manufacturing ... to differentiate what most of them were doing from this thing called PKI (aka most of them didn't actually have any operational business process for doing much more than pushing certificates out the door).


for some topic drift ... the new buzzword is SOA (service oriented architecture) ... i've recently commented that the payment gateway might be considered the original SOA application.

misc. refs:
https://www.garlic.com/~lynn/2005i.html#42 Development as Configuration
https://www.garlic.com/~lynn/2005i.html#43 Development as Configuration
https://www.garlic.com/~lynn/2005i.html#44 SqlServerCE and SOA - an architecture question
https://www.garlic.com/~lynn/2005i.html#48 defeating firewalls made easy
https://www.garlic.com/~lynn/2005k.html#2 Ancient history

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: The Worth of Verisign's Brand
Newsgroups: netscape.public.mozilla.crypto
Date: 22 Jun 2005 15:13:41 -0600
"Anders Rundgren" writes:
Lynn,

Some TTP CAs actually *do* require RP contracts.

The "only" problem with that is that this is usually also connected to RP authentication to OCSP services for payment purposes.

So even if the certs are stale the information is dynamically verified.


so this is the stale, static, redundant and superfluous scenario.

the problem with OCSP services is that it supposedly just says yes/no as to whether the stale, static certificate information is still applicable or not.

as mentioned ... this has all the overhead of having an online service w/o any of the benefits.

the payment infrastructure moved out of this offline (certificate-like), archaic design in the 70s with online authentication and authorization with timely online access to the actual, real information ... like aggregated information and sequences of operations. This resulted in things like support for fraud detection patterns and current account balance. the current account balance represents the starting value (which you might or might not consider including in a stale, static, redundant and superfluous certificate?), in addition to the aggregation of all the ongoing operations updating the current account balance with subtractions and additions (say issue a brand new stale, static, redundant and superfluous certificate every time there is an account balance update, and then spray it all over the world to every possible and/or potential relying party).

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

SHARE 50 years

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: SHARE 50 years
Newsgroups: alt.folklore.computers
Date: 23 Jun 2005 06:06:01 -0600

http://www.share.org/

Fifty years of sharing open source

http://business.newsforge.com/business/05/06/15/166253.shtml?tid=35&tid=18

and some comments

http://business.newsforge.com/comments.pl?sid=47512&mode=flat&commentsort=0&op=Change

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

Book on computer architecture for beginners

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: Book on computer architecture for beginners
Newsgroups: comp.arch,alt.folklore.computers
Date: 23 Jun 2005 06:15:02 -0600
Alex McDonald writes:
And ribbons; ink coated plastic rather than inked fabric.

i tried to search on fabric and film ribbons. i also remember fabric and film ribbon cartridges for 2741 (/selectric) ... film/plastic ribbons were for high-quality, final proof copies. the fabric ribbons were thicker and made for multiple passes. the thickness of the fabric resulted in slightly less crisp character edges (very slightly blurred around the edges of the characters). film ribbons produced crisper character boundaries.

found this listing for nylon ribbons for 1403

http://www.cleansweepsupply.com/pages/item-lex0457937.html

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

SHARE 50 years?

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: SHARE 50 years?
Newsgroups: bit.listserv.ibm-main
Date: 23 Jun 2005 07:21:19 -0600

http://www.share.org/

Fifty years of sharing open source
http://business.newsforge.com/business/05/06/15/166253.shtml?tid=35&tid=18

and some comments
http://business.newsforge.com/comments.pl?sid=47512&mode=flat&commentsort=0&op=Change

ref to presentation made at fall 68 share meeting
https://www.garlic.com/~lynn/94.html#18

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

Signing and bundling data using certificates

From: lynn@garlic.com
Newsgroups: microsoft.public.dotnet.framework.aspnet.security
Subject: Re: Signing and bundling data using certificates
Date: Thu, 23 Jun 2005 10:24:14 -0700
Alan Fisher wrote:
I am attempting to use a private key from a digital certificate that I have installed on my pc to sign some data (a dime attachment) which is sent to a WebService. The scenario is very similar to the one explained in
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/seccrypto/security/digital_signatures.asp

I wish to use the .Net framework to achieve this. I have looked at the WSE 2.0 sample code and have successfully located the certificate in the store but am now struggling with signing the data.

Can someone point me in the right direction? Sample code would be much appreciated

Also, how does the destination user actually get the public key?


the technology is asymmetric key cryptography .... there are a pair of keys ... and what one key encodes the other key decodes. This is different from symmetric key cryptography where the same key is used for both encoding and decoding.

there is a business process called public/private key ... where one key is made public (public key) and the other key is kept confidential and is never divulged (private key).

there is an additional business process called digital signature authentication ... where a hash of some data is made and then encoded with the private key. the corresponding public key can be used to decode the digital signature ... and then compare the decoded digital signature with a recomputed hash of the message. If the recomputed hash and the decoded digital signature are the same, then the recipient knows that 1) the message hasn't been modified and 2) the originator of the message is authenticated.
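the hash/encode/decode/compare flow can be sketched with a toy RSA example (textbook-sized parameters, purely illustrative and not remotely secure -- real keys are 2048+ bits and use proper padding):

```python
import hashlib

# toy RSA parameters -- purely illustrative
p, q = 61, 53
n = p * q                              # modulus = 3233
e = 17                                 # public exponent
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent = 2753

def toy_sign(message: bytes) -> int:
    """hash the message, then encode the hash with the private key"""
    h = int.from_bytes(hashlib.sha256(message).digest(), 'big') % n
    return pow(h, d, n)

def toy_verify(message: bytes, signature: int) -> bool:
    """decode the signature with the public key and compare to a recomputed hash"""
    h = int.from_bytes(hashlib.sha256(message).digest(), 'big') % n
    return pow(signature, e, n) == h

sig = toy_sign(b"pay $10 to bob")
assert toy_verify(b"pay $10 to bob", sig)       # authenticates the originator
assert not toy_verify(b"pay $99 to bob", sig)   # detects modification
```

any change to the message changes the recomputed hash, so the decoded signature no longer matches; and only the holder of the private key d could have produced a signature that decodes correctly under the public key e.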

in standard business practice ... somebody registers their public key with destinations and/or relying parties ... in much the same way they might register a pin, password, SSN#, mother's maiden name, birth date, and/or any other authentication information. The advantage of registering a public key over some sort of static, shared-secret ... is that a public key can only be used to verify digital signatures .... it can't be used to generate a digital signature and therefore can't be used for impersonation.
https://www.garlic.com/~lynn/subpubkey.html#certless

On-file, static, shared-secret authentication information can not only be used for authentication ... but also impersonation.
https://www.garlic.com/~lynn/subintegrity.html#secrets

Digital certificates are a business process that addresses an offline email scenario from the early 80s ... where the recipient dials up their local (electronic) post office, exchanges email, hangs up ... and then is possibly faced with authenticating some first time communication from a total stranger (and had no recourse to either local information and/or online information for obtaining the necessary information). It is somewhat analogous to the "letters of credit" used in the sailing ship days.

A trusted party "binds" the public key with some other information into a "digital certificate" and then digitally signs the package called a digital certificate. The definition of a "trusted party" is that recipients have the public key of the "trusted party" in some local trusted public key repository (for instance browsers are shipped with a list of trusted party public keys in an internal trusted public key repository).

The originator creates a message or document of some sort, digitally signs the information, and then packages up the 1) document, 2) digital signature, and 3) digital certificate (containing some binding of their public key to other information) and transmits it.

The recipient/relying-party eventually gets the package composed of the three pieces. The recipient looks up the trusted party's public key in their trusted public key repository, and validates the digital signature on the enclosed digital certificate. If the digital certificate is valid, they then check the "bound" information in the digital certificate to see if it relates to anything at all they are interested in. If so, then they can take the sender's public key (included in the digital certificate) and validate the digital signature on the message. If that all seems to be valid ... they then make certain assumptions about the content of the actual message.

In normal business operations ... where there is prior relationship between the sender and the receiver ... the receiver will tend to already have authentication information about the sender in a local trusted (public key) repository (and not have to resort to trust redirection thru the use of trusted party public keys and digital certificates).

Another scenario is that in the early 90s, there were x.509 identity digital certificates where the trusted parties (or certification authorities ... i.e. CAs) were looking at grossly overloading the "bound" information in the digital certificates with enormous amounts of personal information. This was in part because the CAs didn't have a good idea what future relying parties might need in the way of information about individuals that they were communicating with.

You started to see some retrenchment of this in the mid-90s ... where institutions were starting to realize that x.509 identity digital certificates grossly overloaded with personal information represented significant privacy and liability issues. Somewhat as a result there was some retrenchment to relying-party-only certificates
https://www.garlic.com/~lynn/subpubkey.html#rpo

which contained little more information than the individual's public key and some sort of account number or other database index. The actual database contained the real information. However, it is trivial to show that such relying-party-only certificates not only violate the original purpose of digital certificates, but are also redundant and superfluous ... aka the relying party registers the individual's public key in their trusted repository along with all of the individual's other information. Since all of the individual's information (including their public key) is already in a trusted repository at the relying party, having an individual repeatedly transmit a digital certificate containing a small, stale, static subset of the same information is redundant and superfluous.

In some of the scenarios involving relying-party-only certificates from the mid-90s it was even worse than redundant and superfluous. One of the scenarios involved a specification for digitally signed payment transactions with an appended relying-party-only digital certificate. Typical payment transactions are on the order of 60-80 bytes. The typical relying-party-only digital certificates involved 4k-12k bytes. Not only were the relying-party-only stale, static digital certificates redundant and superfluous, they also would represent a factor of one hundred times payload bloat for the payment transaction network (increasing the size of a payment transaction by one hundred times for redundant and superfluous stale, static information)

derive key from password

From: Anne & Lynn Wheeler <lynn@garlic.com>
Newsgroups: sci.crypt.research
Subject: Re: derive key from password
Date: 24 Jun 2005 10:07:32 -0600
machiel@braindamage.nl (machiel) writes:
If one derives a key from a password (among other elements) what is accepted cryptographic procedure in terms of security?

1. Can we re-use this key more than once for ciphering data without the risk of extraction/deduction of the password by comparing the various resulting ciphered data sets?

2. Should we add some random element (salt) to the password and derive a new key again every time we cipher data? This way it would be harder (impossible?) to deduce the password from the set of ciphered data.

Of course, when using a cryptographic hardware module, option 2 might be more time consuming (one has to derive the key every time before ciphering data) which is why it might be important to forget about the random salt part and just use the same key again and again. Any help will be appreciated. (Answers can be sent to my mail address as well.)


some posts mentioning financial industry standard ... derived unique key per transaction (dukpt)
https://www.garlic.com/~lynn/aadsm3.htm#cstech8 cardtech/securetech & CA PKI
https://www.garlic.com/~lynn/aepay10.htm#33 pk-init draft (not yet a RFC)
https://www.garlic.com/~lynn/aadsm13.htm#22 Encryption of data in smart cards
https://www.garlic.com/~lynn/aadsm13.htm#24 Encryption of data in smart cards
https://www.garlic.com/~lynn/aadsm18.htm#53 ATM machine security
https://www.garlic.com/~lynn/2002e.html#18 Opinion on smartcard security requested
https://www.garlic.com/~lynn/2002f.html#22 Biometric Encryption: the solution for network intruders?
https://www.garlic.com/~lynn/2003g.html#9 Determining Key Exchange Frequency?
https://www.garlic.com/~lynn/2003g.html#42 What is the best strongest encryption
https://www.garlic.com/~lynn/2003o.html#12 Database design and confidential data protection
https://www.garlic.com/~lynn/2003o.html#18 Database design and confidential data protection
https://www.garlic.com/~lynn/2003o.html#46 What 'NSA'?
https://www.garlic.com/~lynn/2004c.html#56 Bushwah and shrubbery
https://www.garlic.com/~lynn/2004f.html#9 racf
https://www.garlic.com/~lynn/2005k.html#23 More on garbage
https://www.garlic.com/~lynn/aadsm19.htm#36 expanding a password into many keys
https://www.garlic.com/~lynn/aadsm19.htm#37 expanding a password into many keys

there is also some work on longer term derived key material ... where rather than doing a unique derived key per transaction ... there are long term derived keys. in the DUKPT case, clear-text information from the transaction is part of the process deriving the key. The longer term derived keys tend to use some sort of account number. You might find such implementations in transit systems. There is a master key for the whole infrastructure ... and each transit token then has a unique account number with an associated derived key. The transit system may store data in each token using the token-specific derived key. brute force on a token specific key ... doesn't put the whole infrastructure at risk.
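the long-term per-token derived-key scheme described above can be sketched with an HMAC-based derivation (a sketch only -- the actual DUKPT algorithm is specified in ANSI X9.24 and differs in detail; the master key and account numbers here are hypothetical):

```python
import hmac
import hashlib

# hypothetical infrastructure-wide master key held by the back end
MASTER_KEY = bytes.fromhex("000102030405060708090a0b0c0d0e0f")

def token_key(account_number: str) -> bytes:
    """derive a per-token key from the master key and the token's account number.
    the back end can recompute any token's key on demand, while brute-forcing
    one token's key exposes neither the master key nor any other token's key"""
    return hmac.new(MASTER_KEY, account_number.encode(), hashlib.sha256).digest()

k1 = token_key("4000-0001")
k2 = token_key("4000-0002")
assert k1 != k2 and len(k1) == 32   # distinct 256-bit keys per account
```

data stored on each transit token is then enciphered under that token's derived key, so the token itself never has to carry (or risk) the master key.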

some discussion of attack against RFC 2289 one-time password system (uses iterative hashes of a passphrase)
https://www.garlic.com/~lynn/2003n.html#1 public key vs passwd authentication?
https://www.garlic.com/~lynn/2003n.html#2 public key vs passwd authentication?
https://www.garlic.com/~lynn/2003n.html#3 public key vs passwd authentication?
https://www.garlic.com/~lynn/2005i.html#50 XOR passphrase with a constant
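the iterated-hash (Lamport-style) scheme underlying S/KEY and RFC 2289 can be sketched as follows (SHA-256 here purely for illustration; S/KEY itself used MD4/MD5 with output folding):

```python
import hashlib

def hash_chain(passphrase: bytes, n: int) -> bytes:
    """apply the hash function n times to the passphrase"""
    h = passphrase
    for _ in range(n):
        h = hashlib.sha256(h).digest()
    return h

# server stores the N-th hash; the client presents the (N-1)-th as its
# next one-time password, and the server verifies by hashing once more
N = 1000
stored = hash_chain(b"correct horse battery", N)

otp = hash_chain(b"correct horse battery", N - 1)
assert hashlib.sha256(otp).digest() == stored
# on success the server replaces 'stored' with otp, so each password
# is usable only once and earlier chain values are never reusable
```

an eavesdropper who captures otp learns only H^(N-1); producing the next password H^(N-2) would require inverting the hash (or attacking the passphrase directly, which is the weakness discussed in the posts above).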

from my rfc index
https://www.garlic.com/~lynn/rfcietff.htm

https://www.garlic.com/~lynn/rfcidx5.htm#1760
1760 I
The S/KEY One-Time Password System, Haller N., 1995/02/15 (12pp) (.txt=31124) (Refs 1320, 1704) (Ref'ed By 1938, 2222, 2229, 2289, 2945, 4082)


in the RFC summary, clicking on the ".txt=nnnn" field retrieves the actual RFC.

https://www.garlic.com/~lynn/rfcidx6.htm#1938
1938 -
A One-Time Password System, Haller N., Metz C., 1996/05/14 (18pp) (.txt=44844) (Obsoleted by 2289) (Refs 1320, 1321, 1704, 1760) (Ref'ed By 2243, 2284, 2828)



https://www.garlic.com/~lynn/rfcidx7.htm#2289
2289 S
A One-Time Password System, Haller N., Metz C., Nesser P., Straw M., 1998/02/26 (25pp) (.txt=56495) (STD-61) (Obsoletes 1938) (Refs 1320, 1321, 1704, 1760, 1825, 1826, 1827) (Ref'ed By 2444, 2808, 3552, 3631, 3748, 3888) (ONE-PASS)


--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

Jack Kilby dead

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: Jack Kilby dead
Newsgroups: alt.folklore.computers
Date: 24 Jun 2005 12:28:18 -0600
hancock4 writes:
The big challenge was developing ways to manufacture such circuits--reliably and in quantity. Making a few on a lab bench isn't the same thing.

The IBM S/360 history discusses in detail how hard it was to develop production lines for its SLT technology. IBM's engineers were criticized for choosing SLT since it wasn't quite as advanced as other IC designs; but at the time that was the most practical usage.

I'm amazed at how they 'draw' resistors, transistors, and capacitors on an SLT chip of the 1960s. For the life of me I can't figure out how digital cameras can store many megs of information on half-inch removable chips.


there were frequently huge yield/volume issues ... i've observed some number of product sectors that had successful vendors ... that were essentially ignored because of yield/volume and/or margin issues.

i've also seen what i consider more science, technology and engineering go into the manufacturing of a product ... than goes into the product itself. literature tends to have lots of stuff about various product technologies ... but not necessarily a whole lot about product manufacturing technology.

i once worked on a product in the mid-70s that was canceled (before announce) because it only showed $9B revenue over five years and was below the minimum threshold requirement of $10B over five years.

...

or have multi-megapixel CCDs. in the mid-80s, I got asked to spend time on what was then called Berkeley 10m (now called Keck 10m .. and they have built a second one). at the time, as part of the effort, they were testing 200x200 (40k pixels) ccd array at lick observatory ... and some talk about maybe being able to get a 400x400 (160k pixels) for testing. there was an industry rumor that possibly spielberg might have a 2kx2k ccd that he was testing.

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

who invented CONFIG/SYS?

Refed: **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: who invented CONFIG/SYS?
Newsgroups: comp.sys.tandy,alt.folklore.computers
Date: 24 Jun 2005 14:26:49 -0600
noone writes:
In any event, there is (I believe) little to "discuss" from Durda's reply. It says what is known history, that MS-DOS was derived (in some fashion, there is some controversy about how) from CP/M, CP/M's syntax was inspired by various DEC operating systems, and those in turn were inspired by earlier OS's. Consequently, the original subject of this thread - about the "invention" of when two common words of computing use became combined - is simply a happenstance, not significant by itself. A Web search will find this history, or discussions of it.

as well as a little cp67/cms ... including possibly the cp/m name ... a couple past posts referencing the subject:
https://www.garlic.com/~lynn/2004b.html#5 small bit of cp/m & cp/67 trivia from alt.folklore.computers n.g. (thread)
https://www.garlic.com/~lynn/2004e.html#38 [REALLY OT!] Overuse of symbolic constants
https://www.garlic.com/~lynn/2004h.html#40 Which Monitor Would You Pick??????

for some tandy topic drift, the original document formatter/processor done on cms at the science center
https://www.garlic.com/~lynn/subtopic.html#545tech

was called script and used runoff-like "dot" commands. in '69, "G", "M", and "L" invented GML at the science center
https://www.garlic.com/~lynn/submain.html#sgml

(and of course they had to come up with a name that matched their initials). it was later standardized in iso as SGML and later begat HTML, XML, FSML, SAML, etc.

univ. of waterloo did a cms script clone that was in use on cms at cern ... and is the evolutionary path to html .. recent reference in afc to the UofW and cern connection:
https://www.garlic.com/~lynn/2005k.html#58 Book on computer architecture for beginners

an IBM SE (system engineer) out of the LA branch office, in the late 70s, did a cms script clone and sold it on tandy machines (some of the references seem to indicate it is still available):
https://www.garlic.com/~lynn/2000e.html#0 What good and old text formatter are there ?
https://www.garlic.com/~lynn/2000e.html#20 Is Al Gore The Father of the Internet?^
https://www.garlic.com/~lynn/2002b.html#46 ... the need for a Museum of Computer Software
https://www.garlic.com/~lynn/2002h.html#73 Where did text file line ending characters begin?
https://www.garlic.com/~lynn/2002p.html#54 Newbie: Two quesions about mainframes
https://www.garlic.com/~lynn/2003.html#40 InfiniBand Group Sharply, Evenly Divided
https://www.garlic.com/~lynn/2004o.html#5 Integer types for 128-bit addressing
https://www.garlic.com/~lynn/2005.html#46 8086 memory space

for total topic drift ... current w3c offices are only a couple blocks from the old science center location at 545 tech. sq.

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

Refed: **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: The Worth of Verisign's Brand
Newsgroups: netscape.public.mozilla.crypto
Date: 25 Jun 2005 07:48:28 -0600
"Anders Rundgren" writes:
Probably because this is the only thing that is needed. If you need additional information concerning a certified identity, you will in most cases have to ask another party for that.

there are numerous examples where real-time and aggregated information is viewed as advantageous when making a decision ... especially where value is concerned. The simple issue with OCSP is that it needed to preserve the facade that stale, static information was useful all by itself. The original statement was that anybody making any decision with regard to things of value ... if all other things were equal ... had a choice between

1) stale, static, year old information (say about whether a financial account may or may not have existed)

and

2) a real-time response, based on real-time and aggregated information, as to whether they were being paid.

... would relying parties prefer to have stale, static, year-old information ... or would they prefer a real-time answer as to whether or not they were being paid?

The issue is that OCSP goes to all the trouble of a real-time operation responding yes/no to whether the stale, static information is still current ... but doesn't provide a yes/no response to whether the relying party is actually being paid.

The contention is that, if you are going to all the trouble of having a real-time operation, the yes/no response to being paid or not is of significantly more value to a relying party than whether or not some stale, static information is still valid.

the analogy is a strip mall with a bunch of retail stores: an appliance operation, a dry goods operation and an identification operation. you go into the appliance operation, buy an appliance and present a card ... that card initiates an online transaction, which checks your financial worth and recent transactions, and a relying party returns to the merchant a guarantee that they will be (and possibly already have been) paid.

you then go into the identity operation and present a card ... the digital certificate is retrieved by the operation ... it does an OCSP check that the certificate is still valid and then verifies a digital signature. then you walk out of the store (random acts of gratuitous identification).

the issue, of course, is that very few verification or identification things are done just for the sake of doing them ... they are almost always done within the context of performing some other operation. the assertion has always been that verification of stale, static information is only useful to the relying party when they have no recourse to more valuable, real-time information (and/or the stale, static paradigm has recently tried to move into the no-value market niche, where the no-value operations can't justify the cost of real-time operation).

you very seldom have acts of gratuitous identification occurring ... they occur within some context. furthermore, there are a huge number of operations where the issue of identification is superfluous to the objective of the operation ... which may be primarily the exchange of value (as the object of the operation), making identification truly redundant and superfluous (as can be demonstrated when anonymous cash can be used in lieu of a financial-institution-based exchange of value).

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: The Worth of Verisign's Brand
Newsgroups: netscape.public.mozilla.crypto
Date: 25 Jun 2005 08:43:24 -0600
"Anders Rundgren" writes:
3D Secure (a.k.a. VbV) is an interesting twist to this as it really (under the user's "supervision") connects the merchant and the cardholder's bank for getting as fresh information as there can probably be. Also relying on PKI. Scales incredibly well as you only need one cert per bank and CC brand.

Are you saying that the PKI scales or the infrastructure scales?

It would appear to be a descaling of PKI ... since there is only "one cert per bank".

It also has a number of features that could be considered antithetical to the PKI design point. The consumer bank and the consumer have a predefined relationship. It is possible for the consumer bank to ship its public key for direct installation in the consumer's trusted public key repository.

The PKI design point has trusted third party CAs ... installing their public key in the consumer's trusted public key repository ... the original model from the original electronic commerce
https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3

with this thing called SSL.

The CAs then digitally signed digital certificates for operations that the consumer had no prior relationship with. The consumer could validate the digital certificates with the CA public keys on file in their local trusted public key repository (possibly manufactured and integrated into their application, like a browser).

For a predefined relationship between a consumer and their financial institution ... they can exchange public keys and store them in their respective trusted public key repositories (a financial institution can provide the consumer some automation that assists in such an operation).
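This direct-exchange pattern can be sketched as follows (a minimal sketch; the class, party names, and key bytes are hypothetical illustrations, not any actual banking software): the institution's public key is installed once over the pre-existing relationship, and later lookups take the place of third-party certificate validation.

```python
# Sketch of a consumer-side trusted public key repository populated by
# direct exchange over a pre-existing relationship -- no third-party CA
# is consulted. All names and key bytes are illustrative.

class TrustedKeyRepository:
    def __init__(self):
        self._keys = {}  # party name -> on-file public key bytes

    def register(self, party, public_key):
        # direct exchange: install a key delivered over an existing,
        # trusted channel (e.g. the financial institution's own software)
        self._keys[party] = public_key

    def key_for(self, party):
        # lookup replaces certificate-chain validation: the key is
        # either already on file (trusted) or the party is unknown
        return self._keys.get(party)

repo = TrustedKeyRepository()
repo.register("consumer-bank", b"bank-public-key-bytes")

assert repo.key_for("consumer-bank") == b"bank-public-key-bytes"
assert repo.key_for("unknown-third-party") is None
```

The point mirrors the text: with a predefined relationship, trust is established once, out of band, so per-operation reliance on a TTP CA adds nothing.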

PKI would appear to actually make the existing infrastructure less secure ... rather than the consumer directly trusting their financial institution, the consumer would rely on a TTP CA to provide all their trust about their own financial institution.

In the mid-90s there was work done on PKI for payment transactions. One of the things learned from the early-90s x.509 identity certificates was that they appeared to represent significant privacy and liability issues. As a result, many institutions retrenched to relying-party-only digital certificates ...
https://www.garlic.com/~lynn/subpubkey.html#rpo

that contained little more than some sort of database lookup value (like an account number) and a public key. however, it was trivial to demonstrate that such certificates were redundant and superfluous. One possible reason for the ease in demonstrating that such stale, static certificates were redundant and superfluous was that they appeared to totally violate the basic PKI design point, aka requiring an independent, trusted third party to establish trust between two entities that never previously had any interaction.

For two parties that have a pre-existing relationship, it is possible to directly exchange public keys and store them in their respective trusted public key repositories ... and not have to rely on a trusted third party to tell them whether they should trust each other. In the case where a consumer's financial institution is the only entity with a public/private key pair ... it is possible for the consumer to obtain the public key of their trusted financial institution ... without needing to rely on some independent third party to provide them with trust.

The other issue from the mid-90s in the PKI-oriented payment transaction specification ... was that besides using redundant and superfluous stale, static certificates ... they also represented enormous payload bloat for the financial infrastructure. The typical iso 8583 payment transaction is on the order of 60-80 bytes. The RPO-certificate overhead for the operation was on the order of 4k to 12k bytes. The stale, static, redundant and superfluous digital certificate overhead represented an enormous, one hundred times increase in payload bloat.
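The bloat figures above can be checked with back-of-envelope arithmetic (using the byte counts quoted in the text; nothing here is measured data):

```python
# payload bloat: relying-party-only certificate vs. the iso 8583 message
# it accompanies (byte figures taken from the discussion above)
iso8583_bytes = 80                          # typical message is 60-80 bytes
cert_low, cert_high = 4 * 1024, 12 * 1024   # certificate runs 4k-12k bytes

# the certificate carries only an account number and a public key --
# both already on file with the relying party -- so the entire
# overhead buys nothing:
print(cert_low // iso8583_bytes)    # 51  (~50x at the low end)
print(cert_high // iso8583_bytes)   # 153 (~150x at the high end)
```

So the "one hundred times" figure in the text is the right order of magnitude across the stated certificate-size range.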

Another question ... are you saying that the complete transaction goes via this new path ... or does the existing real-time iso 8583 transaction have to be performed in addition to this new real-time function (at least doubling the number and overhead of real-time operations)?

The existing iso 8583 operation goes as straight-through processing in a single real-time round trip. Does the introduction of these new operations improve on the efficiency of that existing single round-trip, straight-through processing?

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: The Worth of Verisign's Brand
Newsgroups: netscape.public.mozilla.crypto
Date: 25 Jun 2005 09:11:27 -0600
with respect to iso 8583 payment network trust and single round-trip, straight thru processing.

part of the issue is that a PKI is redundant and superfluous since the parties don't need to rely on a trusted third party to provide trust between anonymous strangers that have never before met. in some sense, the pre-existing relationship and pre-existing trust allow for more efficient, single round-trip, straight-through processing ... w/o having to go through a trust discovery process for every transaction (authentication should be sufficient).

in the normal operation, a merchant financial institution has a contractual relationship with merchants ... for which the merchant financial institution also takes some amount of financial liability. one of the well-used examples is the airline industry, both loved and somewhat feared by merchant financial institutions. There are a lot of high value transactions ... but there is also the prospect of the airline going bankrupt ... in the past this has represented something like $20m (or more) in outstanding airline tickets that the merchant financial institution had to make good on.

In a manner similar to the merchant financial institution and the merchant, there is also a pre-existing contractual relationship between a consumer and the consumer's financial institution (with the consumer's financial institution also accepting liability for their consumers). Again, no trusted third party PKI is required to establish trust on every operation that goes on between the consumer and the consumer's financial institution.

Over both the merchant financial institutions and the consumer financial institutions are the associations ... where there are pre-existing contractual relationships between the associations and the financial institutions. Again, there is no requirement for a trusted third party PKI to provide for a trust relationship on every transaction between the financial institutions and the associations.

A trusted third party PKI has no role in such an environment because there are pre-existing contractual, trust relationships already in place ... making a trusted third party PKI redundant and superfluous.

So not only is there an end-to-end contractual trust chain that follows from the merchant, to the merchant financial institution, to the associations, to the consumer financial institution, to the consumer ... this pre-existing end-to-end contractual trust chain can be relied upon to improve efficiency, so that the whole trust establishment process doesn't have to be re-executed on every transaction ... allowing for single round-trip, straight-through processing.

The existing issue ... doesn't have so much to do with establishing trust relationships (the objective of TTP CAs & PKIs) as with the simple problem of improving the authentication technology (when trust has already been established, operations can rely on simpler authentication events ... rather than having to repeatedly re-establish the basis for identification and trust) ... it has to do with the vulnerabilities and exploits associated with the existing authentication technology in use.

It is possible to simply improve on the integrity of the authentication technology ... w/o having to introduce the complexity and expense of repeatedly re-establishing trust for every operation.

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

Refed: **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: The Worth of Verisign's Brand
Newsgroups: netscape.public.mozilla.crypto
Date: 25 Jun 2005 09:34:21 -0600
Anne & Lynn Wheeler writes:
Another question ... are you saying that the complete transaction goes via this new path ... or does the existing real-time iso 8583 transaction have to be performed in addition to this new real-time function (at least doubling the number and overhead of real-time operations)?

ref: post on established, pre-existing trust relationships making trusted third party PKI CA trust establishment redundant and superfluous (aka in addition to the issue of drastically inflating the processing overhead on a per transaction basis if the requirement existed to re-establish all trust on a per transaction basis):
https://www.garlic.com/~lynn/2005l.html#13 The Worth of Verisign's Brand

one of the reasons that it is unlikely that the real transaction will go via the new path directly from the merchant to the consumer's financial institution (except in the "on-us" scenario where the same financial institution represents both the consumer and the merchant) is that the merchant financial institution has an interest in real-time tracking of the merchant's activities (aka the merchant financial institution is liable for what the merchant does, in much the same way that the consumer financial institution is liable for consumer transactions).

having the merchant substitute a direct transaction with the consumer financial institution would cut the merchant financial institution out of the single round-trip, straight-through process path. this might be likely only were the consumer financial institution to assume liability not just for the consumer but also for the merchant (as in the "on-us" transaction scenario, where the same financial institution represents both the merchant and the consumer in the transaction).

on going posts in this thread:
https://www.garlic.com/~lynn/2005i.html#12 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#13 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#14 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#17 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#21 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#23 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#24 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#26 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005k.html#60 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005l.html#1 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005l.html#2 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005l.html#3 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005l.html#11 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005l.html#12 The Worth of Verisign's Brand

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: The Worth of Verisign's Brand
Newsgroups: netscape.public.mozilla.crypto
Date: 25 Jun 2005 09:51:51 -0600
oh, and just for the fun of it ... past threads where we've exchanged posts regarding the nature of

1) pre-existing, established contractual trust relationships

vis-a-vis

2) dynamic establishment of trust on every transaction with the aid of a trusted third party PKI certification authority

and, in the case of pre-existing, contractual trust relationships, whether or not PKI certification authorities were redundant and superfluous for also establishing the trust relationship
https://www.garlic.com/~lynn/aepay11.htm#70 Confusing Authentication and Identiification? (addenda)
https://www.garlic.com/~lynn/aepay12.htm#1 Confusing business process, payment, authentication and identification
https://www.garlic.com/~lynn/aadsm12.htm#22 draft-ietf-pkix-warranty-ext-01
https://www.garlic.com/~lynn/aadsm12.htm#41 I-D ACTION:draft-ietf-pkix-sim-00.txt
https://www.garlic.com/~lynn/aadsm12.htm#45 draft-ietf-pkix-warranty-extn-01.txt
https://www.garlic.com/~lynn/aadsm12.htm#48 draft-ietf-pkix-warranty-extn-01.txt
https://www.garlic.com/~lynn/aadsm12.htm#54 TTPs & AADS Was: First Data Unit Says It's Untangling Authentication
https://www.garlic.com/~lynn/aadsm17.htm#9 Setting X.509 Policy Data in IE, IIS, Outlook

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

Newsgroups (Was Another OS/390 to z/OS 1.4 migration

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: Newsgroups (Was Another OS/390 to z/OS 1.4 migration
Newsgroups: bit.listserv.ibm-main,alt.folklore.computers
Date: 25 Jun 2005 13:12:34 -0600
Eric-PHMining@ibm-main.lst (Eric Bielefeld) writes:
A couple of years ago, I was getting so much spam that I decided to change my work email address. The lady in charge of email at the time told me I got more email than all but 2 or 3 people each month at our company worldwide. We employ I'd guess about 5,000 people. She strongly recommended that I not subscribe to IBM-Main via email. They made a special rule for me to allow me to read newsgroups. There may be more now, but back then I was the only one. The only problem with that is that I can't reply via the newsgroup. I have to access the web site, find the message, which sometimes takes a while, and then reply. At home, like I am now, I can just reply.

at one point in the early 80s ... there were periodic claims that for some months, i was in some way responsible for 20 to 30 percent of all bits flowing across the worldwide internal network (there were bits other than email ... like i was also blamed for being the internal network infection vector for the adventure game distribution).

the internal network was larger than the arpanet/internet from just about the start up until around summer of '85.
https://www.garlic.com/~lynn/subnetwork.html#internalnet

at the great switch over from host/imp arpanet to internetworking protocol on 1/1/83 ... the arpanet was around 250 nodes. by comparison, not too long afterwards, the internal network passed 1000 nodes:
https://www.garlic.com/~lynn/internet.htm#22

i've claimed that one of the possible reasons was that the major internal networking nodes had a form of gateway functionality built into every node ... which the arpanet/internet didn't get until the 1/1/83 cut-over to internetworking protocol.

during this period in the early 80s ... there was some growing internal anxiety about this emerging internal networking prevalence ... that had largely grown up from the grassroots.

there were all kinds of efforts formed to try to study and understand what was happening. one such effort even brought in hiltz and turoff (the network nation) to help study what was going on.

also, there was a researcher assigned to sit in the back of my office. they took notes on how i communicated face-to-face (also going to meetings with me) and on the phone ... and they also had access to the contents of all my incoming and outgoing email as well as logs of all my instant messages. this went on for 9 months ... the report also turned into a stanford phd thesis (joint with language and computer AI) ... as well as material for subsequent papers and books. some references are included in the collection of postings on computer mediated communication
https://www.garlic.com/~lynn/subnetwork.html#cmc

one of the stats was that supposedly for the 9 month period, that I exchanged email with an avg. of 275-some people per week (well before the days of spam).

later in the 80s ... there was the nsf network backbone RFP. we weren't allowed to bid ... but we got an nsf study that reported that the backbone that we were operating was at least five years ahead of all bid submissions to build the nsfnet backbone.
https://www.garlic.com/~lynn/internet.htm#0
https://www.garlic.com/~lynn/subnetwork.html#hsdt

the nsf network backbone could be considered the progenitor of the modern internet .... actually deploying a backbone for supporting network of networks (aka an operational characteristic that goes along with the internetworking protocol technology).

note that into the early and mid-90s ... many of the newsgroups were still riding the usenet/uucp rails (not yet having moved to the internet in any significant way) ... with people having either a direct usenet feed or access via some BBS that had a usenet feed. circa 93, I co-authored an article for boardwatch (a bbs industry mag) about drivers I had done for a full usenet satellite broadcast feed.

the "ISPs" of this era were typically offering shell accounts and/or UUCP accounts (predating PPP and tcp/ip connectivity).

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

Refed: **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: The Worth of Verisign's Brand
Newsgroups: netscape.public.mozilla.crypto
Date: 25 Jun 2005 13:35:26 -0600
"Anders Rundgren" writes:
I don't believe in that model anymore. 3D offers so much more possibilities for integration in purchasing systems which the classic model cannot do. Neither can AADS. It is like "federation" for payments.

see the later question about whether 3d is doubling the number of online transactions ... and also possibly attempting to force-fit a trusted third party CA PKI business model (for providing trust between two entities that have had no prior interaction and/or communication) as a redundant and superfluous business operation where a contractual relationship already exists.
https://www.garlic.com/~lynn/2005l.html#12
https://www.garlic.com/~lynn/2005l.html#13
https://www.garlic.com/~lynn/2005l.html#14

as mentioned ... it is unlikely that 3D (going directly from the merchant to the consumer financial institution) is actually replacing the existing payment message transport ... unless it is actually suggesting that the merchant financial institution is no longer involved representing the merchant ... and that the consumer financial institutions will be assuming all liability responsibility for the merchant.

furthermore, if you study the existing infrastructure ... not only does the federation of payments already exist ... but there are long term contractual trust vehicles in place that support that federation of payments (between merchant, merchant financial institution, association, consumer financial institution, and consumer).

if it isn't replacing the existing real-time, online, single round-trip, straight-through processing ... that directly involves all the financially responsible parties ... then presumably it is just adding a second, online, real-time transaction to an existing online, real-time transaction? (doubling the transaction and processing overhead).

one of the things that kindergarten, security 101 usually teaches is that if you bifurcate transaction operation in such a way ... you may be opening up unnecessary security and fraud exposures ... in addition to possibly doubling the transaction and processing overhead.

now, the design point for the stale, static, PKI model was for establishing trust for a relying party that had no other recourse about first time communication with a party where no previous relationship existed. Supposedly 3d (assuming that it is just adding a second realtime, online transaction to an already existing, realtime online transaction) is doubling the number and overhead of online, realtime transactions .... in addition to managing to craft in some stale, static PKI processing.

the AADS model doesn't do anything about federation or non-federation of payments. AADS simply provides improved authentication technology integrated with standard business operations:
https://www.garlic.com/~lynn/x959.html#aads

There have been some significant protocols defined over the past several years ... where authentication was done as an independent operation ... totally separate from authenticating the transaction itself. In all such cases that I know of, it has been possible to demonstrate man-in-the-middle (MITM) attacks
https://www.garlic.com/~lynn/subintegrity.html#mitm

where authentication is done separately from the actual transaction.
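A toy illustration of that separation (HMAC stands in for a digital signature here, and all keys, nonces, and messages are hypothetical values): when authentication is a separate event, nothing binds it to the transaction contents, so a MITM can forward the authentication proof unchanged and swap the transaction; authenticating the transaction contents themselves closes that gap.

```python
# HMAC stands in for a digital signature; keys/messages are toy values.
import hashlib
import hmac

key = b"consumer-signing-key"  # known only to the consumer and the bank

def sign(data: bytes) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()

# pattern 1: authenticate a challenge, then send the transaction separately
challenge = b"nonce-123"
auth_proof = sign(challenge)          # proves key possession, nothing more
transaction = b"pay merchant-A $10"
tampered = b"pay attacker-B $9999"    # what a MITM substitutes in transit
# the bank's authentication check still passes -- it never covered the
# transaction, so nothing ties `tampered` to the authentication event:
assert hmac.compare_digest(auth_proof, sign(challenge))

# pattern 2: authenticate the transaction contents themselves
tx_proof = sign(transaction)
# now the substituted transaction fails verification:
assert not hmac.compare_digest(tx_proof, sign(tampered))
```

The second pattern is the shape of the x9.59 approach described below: the transaction itself carries the authentication, rather than riding on a separately authenticated session.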

in the mid-90s the x9a10 financial standards working group was tasked with preserving the integrity of the financial infrastructure for all retail payments ... and came up with x9.59
https://www.garlic.com/~lynn/x959.html#x959
https://www.garlic.com/~lynn/subpubkey.html#privacy

which simply states that the transaction is directly authenticated. some recent posts (in a totally different thread) going into a number of infrastructure vulnerabilities and the x9.59 financial standard countermeasures:
https://www.garlic.com/~lynn/aadsm19.htm#17 What happened with the session fixation bug?
https://www.garlic.com/~lynn/aadsm19.htm#32 Using Corporate Logos to Beat ID Theft
https://www.garlic.com/~lynn/aadsm19.htm#38 massive data theft at MasterCard processor
https://www.garlic.com/~lynn/aadsm19.htm#39 massive data theft at MasterCard processor
https://www.garlic.com/~lynn/aadsm19.htm#40 massive data theft at MasterCard processor
https://www.garlic.com/~lynn/aadsm19.htm#44 massive data theft at MasterCard processor

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

Refed: **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: The Worth of Verisign's Brand
Newsgroups: netscape.public.mozilla.crypto
Date: 25 Jun 2005 14:23:43 -0600
"Anders Rundgren" writes:
I don't believe in that model anymore. 3D offers so much more possibilities for integration in purchasing systems which the classic model cannot do. Neither can AADS. It is like "federation" for payments.

one of the things that 3D appears to offer is keeping the original real-time, online transaction and adding a second real-time online transaction ... IN ADDITION to throwing in redundant and superfluous PKI operations; the original PKI design point was to provide a trust solution for a relying party, typically in an offline environment, where the relying party had no other trust recourse involving the other party (having had no prior communication and/or prior relationship).

there have been some threads about defense-in-depth.

the counter argument to defense-in-depth ... is that a lot of defense-in-depth strategies drastically increase the complexity of the infrastructure ... and frequently it is complexity itself that opens up vulnerabilities and exploits.

the countermeasure to complexity vulnerabilities and exploits is frequently KISS ... where simpler actually wins out over defense-in-depth and more complex approaches. In part, defense-in-depth, while possibly creating overlapping layers ... frequently also creates cracks between such layers that allow the crooks to slip through.

a couple past threads mentioning defense-in-depth
https://www.garlic.com/~lynn/aepay11.htm#0 identity, fingerprint, from comp.risks
https://www.garlic.com/~lynn/2002j.html#40 Beginner question on Security
https://www.garlic.com/~lynn/aadsm19.htm#27 Citibank discloses private information to improve security
https://www.garlic.com/~lynn/2005b.html#45 [Lit.] Buffer overruns

numerous past posts mentioning KISS:
https://www.garlic.com/~lynn/aadsm2.htm#mcomfort Human Nature
https://www.garlic.com/~lynn/aadsm3.htm#kiss1 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss2 Common misconceptions, was Re: KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp-00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss3 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss4 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss5 Common misconceptions, was Re: KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss6 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss7 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss8 KISS for PKIX
https://www.garlic.com/~lynn/aadsm3.htm#kiss9 KISS for PKIX .... password/digital signature
https://www.garlic.com/~lynn/aadsm3.htm#kiss10 KISS for PKIX. (authentication/authorization seperation)
https://www.garlic.com/~lynn/aadsm5.htm#liex509 Lie in X.BlaBla...
https://www.garlic.com/~lynn/aadsm7.htm#3dsecure 3D Secure Vulnerabilities?
https://www.garlic.com/~lynn/aadsm8.htm#softpki10 Software for PKI
https://www.garlic.com/~lynn/aepay3.htm#gaping gaping holes in security
https://www.garlic.com/~lynn/aepay7.htm#nonrep3 non-repudiation, was Re: crypto flaw in secure mail standards
https://www.garlic.com/~lynn/aepay7.htm#3dsecure4 3D Secure Vulnerabilities? Photo ID's and Payment Infrastructure
https://www.garlic.com/~lynn/aadsm10.htm#boyd AN AGILITY-BASED OODA MODEL FOR THE e-COMMERCE/e-BUSINESS ENTERPRISE
https://www.garlic.com/~lynn/aadsm11.htm#10 Federated Identity Management: Sorting out the possibilities
https://www.garlic.com/~lynn/aadsm11.htm#30 Proposal: A replacement for 3D Secure
https://www.garlic.com/~lynn/aadsm12.htm#19 TCPA not virtualizable during ownership change (Re: Overcoming the potential downside of TCPA)
https://www.garlic.com/~lynn/aadsm12.htm#54 TTPs & AADS Was: First Data Unit Says It's Untangling Authentication
https://www.garlic.com/~lynn/aadsm13.htm#16 A challenge
https://www.garlic.com/~lynn/aadsm13.htm#20 surrogate/agent addenda (long)
https://www.garlic.com/~lynn/aadsm15.htm#19 Simple SSL/TLS - Some Questions
https://www.garlic.com/~lynn/aadsm15.htm#20 Simple SSL/TLS - Some Questions
https://www.garlic.com/~lynn/aadsm15.htm#21 Simple SSL/TLS - Some Questions
https://www.garlic.com/~lynn/aadsm15.htm#39 FAQ: e-Signatures and Payments
https://www.garlic.com/~lynn/aadsm15.htm#40 FAQ: e-Signatures and Payments
https://www.garlic.com/~lynn/aadsm16.htm#1 FAQ: e-Signatures and Payments
https://www.garlic.com/~lynn/aadsm16.htm#10 Difference between TCPA-Hardware and a smart card (was: example:secure computing kernel needed)
https://www.garlic.com/~lynn/aadsm16.htm#12 Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)
https://www.garlic.com/~lynn/aadsm17.htm#0 Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)
https://www.garlic.com/~lynn/aadsm17.htm#41 Yahoo releases internet standard draft for using DNS as public key server
https://www.garlic.com/~lynn/aadsm17.htm#60 Using crypto against Phishing, Spoofing and Spamming
https://www.garlic.com/~lynn/aadsmail.htm#comfort AADS & X9.59 performance and algorithm key sizes
https://www.garlic.com/~lynn/aepay10.htm#76 Invisible Ink, E-signatures slow to broadly catch on (addenda)
https://www.garlic.com/~lynn/aepay10.htm#77 Invisible Ink, E-signatures slow to broadly catch on (addenda)
https://www.garlic.com/~lynn/aepay11.htm#73 Account Numbers. Was: Confusing Authentication and Identiification? (addenda)
https://www.garlic.com/~lynn/99.html#228 Attacks on a PKI
https://www.garlic.com/~lynn/2001.html#18 Disk caching and file systems. Disk history...people forget
https://www.garlic.com/~lynn/2001l.html#1 Why is UNIX semi-immune to viral infection?
https://www.garlic.com/~lynn/2001l.html#3 SUNW at $8 good buy?
https://www.garlic.com/~lynn/2002b.html#22 Infiniband's impact was Re: Intel's 64-bit strategy
https://www.garlic.com/~lynn/2002b.html#44 PDP-10 Archive migration plan
https://www.garlic.com/~lynn/2002b.html#59 Computer Naming Conventions
https://www.garlic.com/~lynn/2002c.html#15 Opinion on smartcard security requested
https://www.garlic.com/~lynn/2002d.html#0 VAX, M68K complex instructions (was Re: Did Intel Bite Off MoreThan It Can Chew?)
https://www.garlic.com/~lynn/2002d.html#1 OS Workloads : Interactive etc
https://www.garlic.com/~lynn/2002e.html#26 Crazy idea: has it been done?
https://www.garlic.com/~lynn/2002e.html#29 Crazy idea: has it been done?
https://www.garlic.com/~lynn/2002i.html#62 subjective Q. - what's the most secure OS?
https://www.garlic.com/~lynn/2002k.html#11 Serious vulnerablity in several common SSL implementations?
https://www.garlic.com/~lynn/2002k.html#43 how to build tamper-proof unix server?
https://www.garlic.com/~lynn/2002k.html#44 how to build tamper-proof unix server?
https://www.garlic.com/~lynn/2002m.html#20 A new e-commerce security proposal
https://www.garlic.com/~lynn/2002m.html#27 Root certificate definition
https://www.garlic.com/~lynn/2002p.html#23 Cost of computing in 1958?
https://www.garlic.com/~lynn/2003b.html#45 hyperblock drift, was filesystem structure (long warning)
https://www.garlic.com/~lynn/2003b.html#46 internal network drift (was filesystem structure)
https://www.garlic.com/~lynn/2003c.html#66 FBA suggestion was Re: "average" DASD Blocksize
https://www.garlic.com/~lynn/2003d.html#14 OT: Attaining Perfection
https://www.garlic.com/~lynn/2003h.html#42 IBM says AMD dead in 5yrs ... -- Microsoft Monopoly vs
https://www.garlic.com/~lynn/2003.html#60 MIDAS
https://www.garlic.com/~lynn/2003m.html#33 MAD Programming Language
https://www.garlic.com/~lynn/2003n.html#37 Cray to commercialize Red Storm
https://www.garlic.com/~lynn/2004c.html#26 Moribund TSO/E
https://www.garlic.com/~lynn/2004e.html#26 The attack of the killer mainframes
https://www.garlic.com/~lynn/2004e.html#30 The attack of the killer mainframes
https://www.garlic.com/~lynn/2004f.html#58 Infiniband - practicalities for small clusters
https://www.garlic.com/~lynn/2004f.html#60 Infiniband - practicalities for small clusters
https://www.garlic.com/~lynn/2004g.html#24 |d|i|g|i|t|a|l| questions
https://www.garlic.com/~lynn/2004h.html#51 New Method for Authenticated Public Key Exchange without Digital Certificates
https://www.garlic.com/~lynn/2004q.html#50 [Lit.] Buffer overruns
https://www.garlic.com/~lynn/2005.html#10 The Soul of Barb's New Machine
https://www.garlic.com/~lynn/2005.html#12 The Soul of Barb's New Machine
https://www.garlic.com/~lynn/2005c.html#22 [Lit.] Buffer overruns
https://www.garlic.com/~lynn/aadsm19.htm#27 Citibank discloses private information to improve security
https://www.garlic.com/~lynn/2005i.html#19 Improving Authentication on the Internet

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

Bank of America - On Line Banking *NOT* Secure?

Refed: **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: Bank of America - On Line Banking *NOT* Secure?
Newsgroups: sci.crypt
Date: 25 Jun 2005 15:24:21 -0600
"John E. Hadstate" writes:
You can still connect to https://www.bankofamerica.com/ but they immediately redirect you to http://www.bankofamerica.com/. From a cursory examination of the Javascript, it looks like the login information is submitted using an https connection to a cgi application.

Originally, SSL was supposed to be a countermeasure for two vulnerabilities:

1) spoofed website and/or MITM attack
2) eavesdropping

there have been mechanisms that allow key exchange without requiring certificates & CAs ... which would then allow encrypted sessions as a countermeasure for eavesdropping.

the SSL domain name certificates were supposed to provide assurance that the domain name you typed in for the URL ... matches the domain name provided in the SSL domain name certificate (from the server)
https://www.garlic.com/~lynn/subpubkey.html#sslcert
and subsequently leveraged to provide key exchange with the valid end-point and end-to-end encryption.
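The check described above can be sketched in a few lines. This is a simplified illustration with a hypothetical helper function; real clients follow the full RFC 6125 host name matching rules (here only a single left-most wildcard label is handled):

```python
# Sketch of the SSL check described above: the hostname in the URL the
# user typed must match the domain name bound into the server's
# certificate. (Hypothetical helper; real clients implement the complete
# RFC 6125 matching rules.)
from urllib.parse import urlparse

def hostname_matches_cert(typed_url, cert_domain):
    """Compare the host in the user-entered URL with the certificate's
    domain name, allowing a single left-most wildcard label."""
    host = (urlparse(typed_url).hostname or "").lower()
    cert_domain = cert_domain.lower()
    if cert_domain.startswith("*."):
        # wildcard covers exactly one left-most label
        return "." in host and host.split(".", 1)[-1] == cert_domain[2:]
    return host == cert_domain

print(hostname_matches_cert("https://www.bankofamerica.com/", "www.bankofamerica.com"))  # True
print(hostname_matches_cert("https://www.evil.example/", "www.bankofamerica.com"))       # False
```

The point of the posting is that this check only ever runs against the URL actually used for the connection ... if an attacker supplies that URL, the check passes trivially.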

the problem was that a lot of merchants ... considering the original SSL target for e-commerce
https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3

... found that they got something like five times the thruput using non-SSL. The result was that merchants avoided using SSL & https for non-eavesdropping scenarios ... reserving it solely for eavesdropping-like operations. in the e-commerce scenario, that typically meant the user eventually got to click on a "check-out" or "pay" button, which, in turn, invoked SSL for the payment phase.

the problem was that the URL the user originally provided was never checked against the certificate of the site the user was visiting. So if the user happened to be dealing with a spoofed site ... when they finally got to the "pay" button, that button generated a URL (on the user's behalf) ... and a URL provided by a spoofed site was highly likely to match whatever was in the SSL domain name certificate of the server the user was then directed to.

the issue is that if there really is a spoofed site vulnerability and the user might happen to be visiting a spoofed site (which is in large part the justification for SSL, ssl domain name certificates, certification authorities, etc) ... then nothing such a suspect site does or provides should be trusted ... including any javascript or other html-related stuff that invokes ssl (as an eavesdropping countermeasure) ... since it may also point to another fraudulent site (with the crooks merely keeping other crooks from eavesdropping on their spoofed communication).

spoofed site technology can either be a straight spoofed site ... its own site with all its own files providing the look & feel of the real site. A spoofed site might also be done as a man-in-the-middle attack ... where the spoofed site actually acts as a middle man between the end-user and the real site ... possibly subtly modifying the communication passing thru:
https://www.garlic.com/~lynn/subintegrity.html#mitm

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

Newsgroups (Was Another OS/390 to z/OS 1.4 migration

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: Newsgroups (Was Another OS/390 to z/OS 1.4 migration
Newsgroups: bit.listserv.ibm-main,alt.folklore.computers
Date: 25 Jun 2005 21:30:29 -0600
David Scheidt writes:
In 1996, I worked for an ISP with such a USENET feed. There were some hardware issues to work out -- there was a serial interface between the satellite receiver and the host computer, which ran at 115Kbps with no flow control, which meant we had to dedicate a machine to just reading the stuff off the wire (And doing some sort of decompression?). A second machine held the news spool and ran innd. Once we got that sorted out, we saw our out bound network usage get pegged (A pair of DS1s, I think). Some investigation showed that we were feeding our upstream news peers. We ended up having to hold articles for an hour before we'd offer them to our peers. Even then, there were some locations for which the best route to UUNET was through us. Shortly thereafter, the satellite feed provider went broke and had their transmitter turned off.

re:
https://www.garlic.com/~lynn/2005l.html#16 Newsgroups (Was Another OS/390 to z/OS 1.4 migration

old usenet newsgroup posting from the feed:


Path: wheeler!pagesat!olivea!hal.com!darkstar.UCSC.EDU!osr
From: vern@daffy.ee.lbl.gov (Vern Paxson)
Newsgroups: comp.os.research
Subject: Paper on wide-area TCP growth trends available for ftp
Date: 13 May 1993 17:52:04 GMT
Lines: 34
Approved: comp-os-research@ftp.cse.ucsc.edu
Message-ID: <1su1s4INNsdj@darkstar.UCSC.EDU>
NNTP-Posting-Host: ftp.cse.ucsc.edu
Originator: osr@ftp

The following paper is now available via anonymous ftp to ftp.ee.lbl.gov.
Retrieve WAN-TCP-growth-trends.ps.Z (about 100KB):

Growth Trends in Wide-Area TCP Connections
  Vern Paxson
Lawrence Berkeley Laboratory and
EECS Division, University of California, Berkeley
vern@ee.lbl.gov

We analyze the growth of a medium-sized research laboratory's
  wide-area TCP connections over a period of more than two years.
Our data consisted of six month-long traces of all TCP connections
made between the site and the rest of the world.  We find that
{\em smtp\/}, {\em ftp\/}, and {\em X11} traffic all exhibited
  exponential growth in the number of connections and bytes
transferred, at rates significantly greater than that at which the
  site's overall computing resources grew; that individual users
increasingly affected the site's traffic profile by making
wide-area connections from background scripts; that the proportion
of local computers participating in wide-area traffic outpaces the
  site's overall growth; that use of the network by individual
computers appears to be constant for some protocols ({\em telnet})
  and growing exponentially for others ({\em ftp\/}, {\em smtp\/});
and that wide-area traffic geography is diverse and dynamic.

If you have trouble printing it let me know and I'll mail you hardcopy.

Vern

Vern Paxson                             vern@ee.lbl.gov
Systems Engineering                     ucbvax!ee.lbl.gov!vern
Lawrence Berkeley Laboratory            (510) 486-7504

backyard full usenet feed
https://www.garlic.com/~lynn/pagesat.jpg

pagesat dish

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: The Worth of Verisign's Brand
Newsgroups: netscape.public.mozilla.crypto
Date: 26 Jun 2005 08:30:30 -0600
"Anders Rundgren" writes:
In fact they sometimes do but here you have to hold your horses; this certificate has nothing to do with CCs, it is a login/signature solution for the customer to the bank. This PKI is typically in-house while the 3D secure is CC-branded as otherwise merchants would not recognize CC-branded banks.

so the consumer doesn't need a PKI public key when they are dealing with their own bank ... they could just record a certificate-less public key
https://www.garlic.com/~lynn/subpubkey.html#certless

for their financial institution in their trusted public key store. this would also eliminate many of the bank site spoofing vulnerabilities ... recent discussion
https://www.garlic.com/~lynn/2005l.html#19

in the above ... it discusses various kinds of spoofing and MITM-attacks ... where the end user is provided with a URL ... rather than entering it themselves. that enables an exploit of SSL ... which only verifies the domain name in the entered URL against the domain name in the supplied certificate. If you aren't entering the URL ... but it is being provided by an attacker ... then the attacker is likely to provide a URL that corresponds to a certificate they hold valid rights to. This has been a long-recognized characteristic.
https://www.garlic.com/~lynn/subpubkey.html#sslcert

A consumer, having vetted a bank's public key for storing in their own trusted public key repository ... then can use that vetted public key for future communication with their financial institution ... and not be subject to vulnerabilities and exploits of an externally provided (certificate-based) public key that has had no vetting .. other than it is a valid public key and belongs to somebody.
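The certificate-less model above amounts to the consumer keeping a small trusted public key repository of their own. A minimal sketch, with a hypothetical store layout and placeholder key bytes (real software would pin the bank's actual public key, vetted out-of-band):

```python
# Sketch of the certificate-less model described above: the consumer's
# software keeps a local trusted store of vetted public keys, and on each
# later contact compares the key the server presents against the stored
# one. (Hypothetical store layout; the key bytes here are placeholders.)
import hashlib

trusted_keys = {}  # domain -> SHA-256 fingerprint of the vetted public key

def register(domain, pubkey_bytes):
    """One-time vetting step: record the bank's public key fingerprint."""
    trusted_keys[domain] = hashlib.sha256(pubkey_bytes).hexdigest()

def is_trusted(domain, presented_key_bytes):
    """Later contact: accept only the exact key vetted earlier."""
    fingerprint = hashlib.sha256(presented_key_bytes).hexdigest()
    return trusted_keys.get(domain) == fingerprint

register("www.bankofamerica.com", b"---bank public key bytes---")
print(is_trusted("www.bankofamerica.com", b"---bank public key bytes---"))  # True
print(is_trusted("www.bankofamerica.com", b"---attacker key bytes---"))     # False
```

A spoofed or MITM site presenting any other key fails the comparison, regardless of what certificates it can show.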

The purpose of PKI has been to allow relying parties to establish some level of trust in first-time encounters with entities that are otherwise complete strangers ... where the relying party has no other recourse to information for establishing trust. The design point was somewhat from the early 80s, when there was a much lower level of online connectivity and relying parties frequently operated in an offline environment.

With the ubiquitous proliferation of the internet, those offline pockets are being drastically reduced. Somewhat as a result, some PKIs have attempted to move into the no-value market segment ... where a relying party is online ... but the value of the operation doesn't justify performing an online transaction. The issue is that as the internet becomes much more pervasive ... the cost of online internet operations drops radically ... which in turn drastically reduces the no-value situations that can't justify an online operation.

Presumably in the 3d secure PKI scenario, it has a financial institution's CC-specific certificate that is targeted specifically at relying parties that have had no prior dealings with that financial institution(*?*).

Presumably this implies the merchant as a relying party dealing with the consumer's financial institution (the other alternative is the consumer as a relying party dealing with the merchant's financial institution ... but I have seen nothing that supports that scenario). Now, going back to well before the rise of PKI to address the offline trust scenario ... the payment card industry had online transactions that went from the merchant through a federated infrastructure all the way to the consumer's financial institution and back as straight-through processing. This included contractual trust establishment with various kinds of obligations and liabilities ... the consumer's financial institution assuming certain liabilities on behalf of the consumer and the merchant's financial institution assuming certain liabilities on behalf of the merchant. Possibly because of these obligations ... both financial institutions have an interest in the transaction passing through them.

As mentioned before ... it appears that 3d secure doesn't eliminate the existing online real-time transaction that conforms to significant contractual and liability obligations. 3d secure appears to add an additional, 2nd online transaction ... allowing the merchant to communicate directly with the consumer's financial institution (bypassing the established contractual and liability obligations involving the merchant's financial institution). Furthermore, 3d secure appears to include a PKI certificate ... targeted at establishing trust where the relying party has no other recourse for trust establishment. However, the merchant is already covered by the contractual trust operations that have been standard business practice for decades.

So what possible motivation is there for a merchant to add additional overhead and processing(*?*).

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: The Worth of Verisign's Brand
Newsgroups: netscape.public.mozilla.crypto
Date: 26 Jun 2005 09:36:46 -0600
"Anders Rundgren" writes:
I don't believe in that model anymore. 3D offers so much more possibilities for integration in purchasing systems which the classic model cannot do. Neither can AADS. It is like "federation" for payments.

as an aside ... the threat and vulnerability analysis done by the financial standards x9a10 working group (charged with preserving the integrity of the financial infrastructure for all retail payments) in the mid-90s, didn't find issues with trust, business, and/or message flows of the existing iso 8583 standard transactions.

the issue in the x9.59 standards work
https://www.garlic.com/~lynn/x959.html#x959
https://www.garlic.com/~lynn/subpubkey.html#privacy

was the threats and vulnerabilities in the authentication technology, whose integrity level has possibly eroded over the past 30 years or so (in the face of technology advances).

the primary issue was the authentication of the consumer for the transaction. this cropped up in two different aspects

1) is the consumer originating the transaction, really the entity that is authorized to perform transactions against the specific account

2) somewhat because of authentication integrity issues, starting at least in the 90s, there was an increase in skimming and harvesting ... either direct skimming of the magnetic stripe information or harvesting of account transaction databases ... both supporting later counterfeiting activities enabling generation of fraudulent transactions
https://www.garlic.com/~lynn/subintegrity.html#harvest

the countermeasure corner stones of x9.59 then became:

1) use technology for drastically increasing the authentication strength directly associated with transactions ... as a countermeasure to not being sure that the entity originating the transaction is really the entity authorized to perform transactions for that account.

2) business rule that PANs (account numbers) used in strongly authenticated transactions aren't allowed to be used in poorly authenticated or non-authenticated transactions (i.e. don't authorize a poorly authenticated transaction having a PAN that is identified for use only in strongly authenticated transactions). this is a countermeasure to the skimming/harvesting vulnerabilities and exploits.
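The second corner stone is just a business rule check ahead of normal authorization. A minimal sketch, with a hypothetical data layout and a made-up PAN:

```python
# Sketch of the X9.59 business rule described above: a PAN flagged for
# strongly authenticated use only must never authorize a poorly
# authenticated transaction. (Hypothetical data layout; the PAN below
# is made up for illustration.)
x959_only_pans = {"4000111122223333"}  # PANs restricted to strong auth

def authorize(pan, strongly_authenticated):
    """Business rule check run before ordinary authorization logic."""
    if pan in x959_only_pans and not strongly_authenticated:
        # a skimmed/harvested PAN is useless without the private key
        return False
    return True  # continue with ordinary authorization processing

print(authorize("4000111122223333", strongly_authenticated=True))   # True
print(authorize("4000111122223333", strongly_authenticated=False))  # False
```

Nothing about the rule depends on keeping the PAN secret ... which is exactly why it defuses skimming and harvesting.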

there was a joke with regard to the second countermeasure corner stone: you could blanket the world in miles-deep cryptography and you still couldn't contain the skimming/harvesting activities. the second corner stone simply removes any practical benefit that skimming/harvesting provides to crooks generating fraudulent transactions. slightly related post on security proportional to risk:
https://www.garlic.com/~lynn/2001h.html#61
recent similar posting in another thread:
https://www.garlic.com/~lynn/2005k.html#23
https://www.garlic.com/~lynn/aadsm16.htm#20

having helped with the deployment of the e-commerce ssl based infrastructure
https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3

we recognized a large number of situations where PKIs that had originally been designed to address trust issues between relying parties and entities that had no previous contact ... were being applied to environments that had long-term and well-established trust and relationship management infrastructures (aka if one has a relationship management infrastructure that provides long-term and detailed trust history about a specific relationship ... then a PKI becomes redundant and superfluous as a trust establishment mechanism).

In the AADS model
https://www.garlic.com/~lynn/x959.html#aads
involving certificate-less public key operation
https://www.garlic.com/~lynn/subpubkey.html#certless

we attempted to map publickey-based authentication technology into existing and long-term business processes and relationship management infrastructures.

the existing authentication landscape is largely shared-secret based
https://www.garlic.com/~lynn/subintegrity.html#secret

where the same information that is used for originating a transaction is also used for verifying a transaction. this opens up harvesting vulnerabilities and threats against the verification repositories.

basically asymmetric cryptography is a technology involving pairs of keys, where data encoded by one key is decoded by the other key.

a business process has been defined for asymmetric cryptography where one of the key pair is designated "public" and can be widely distributed. The other of the key pair is designated "private" and kept confidential and never divulged.

a further business process has been defined, called "digital signatures", where a hash of some message or document is encoded with a private key. later, a relying party can recalculate the hash of the same message or document, decode the digital signature with the corresponding public key, and compare the two hashes. if the two hashes are the same ... then the relying party can assume:

1) the message/document hasn't been modified since being digitally signed

2) something you have authentication, aka the originating entity has access to, and use of the corresponding private key.

an additional business process was created called PKIs and certification authorities that was targeted at the environment where a relying party is dealing with first time communication with a stranger and has no other recourse for trust establishment about the total stranger. note however, that PKIs and certification authorities can be shown to be redundant and superfluous in environments where the relying party has long established business processes and trust/relationship management infrastructures for dealing with pre-existing relationships.

However, just because PKIs and the certification authority business process can be shown to be redundant and superfluous in most existing modern day business operations ... that doesn't preclude digital signature technology being used (in a certificate-less environment) as a stronger form of authentication (relying on existing and long-established relationship management processes for registering a public key in lieu of shared-secret-based authentication material).

leveraging long established relationship management infrastructures for registering public key authentication material in lieu of shared-secret authentication material (and use of public key oriented authentication) is a countermeasure to many kinds of harvesting and skimming vulnerabilities and threats. Many of the identity theft reports result from harvesting/skimming of common, static, shared-secret authentication material for later use in fraudulent transactions. The advantage of public key based authentication material is that while it can be used for authentication purposes, it doesn't have the short-coming of also being usable for originating fraudulent transactions and/or impersonation.

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

The Worth of Verisign's Brand

Refed: **, - **, - **, - **, - **
From: lynn@garlic.com
Newsgroups: netscape.public.mozilla.crypto
Subject: Re: The Worth of Verisign's Brand
Date: Mon, 27 Jun 2005 08:45:04 -0700
Anders Rundgren wrote:
Absolutely! However, there is no infrastructure in place for that.

Another reason for the PKI solution is that the financial sector (which you always refer to) has turned out to be the only remaining survivor on the client certificate market here not counting low-value e-mail certs.

The market is mainly consisting of governments who desperately need to reduce costs for administration. It is hard to see that anything but a 1-to-many ID TTP solution would fit that scenario.

But this PKI is usually based on contracts so it fits your view on how a CA should operate. I believe both banks and governments should go to an open subscriber-based model as it will long-term be most profitable/cheapest. That is, CA liability is IMHO an overrated issue. By having banks use their own stuff, they have all the reasons for doing the right thing.

Assume you are losing your ID on the wrong side of the globe. How would anybody but the financial sector be able to handle this? VeriSign? Not a chance.


there are several infrastructures in place for that.

in the mid-90s, one of the pki oriented payment structures had the financial institutions registering public keys and issuing relying-party-only certificates.

the issue wasn't with the registering of the public keys ... since the financial institutions have well established relationship management infrastructures.

the problem was trying to mandate that a simple improvement in authentication technology be shackled to an extremely cumbersome, expensive, redundant and superfluous PKI infrastructure.

the other issue ... was that the horribly complex, heavyweight and expensive PKI infrastructure had limited their solution to only addressing eavesdropping on transactions in-flight ... which was already adequately addressed by the existing e-commerce SSL solution
https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3
https://www.garlic.com/~lynn/subpubkey.html#sslcert

and was providing no additional improvement in the integrity landscape.

so you have a simple and straight-forward mechanism for minor technology improvement in authentication ... shackled to a horribly complex, expensive, redundant and superfluous PKI operation which provided no additional countermeasures to the major e-commerce threats and vulnerabilities (beyond the existing deployed SSL solution).

Now if you were a business person and were given a choice between two solutions that both effectively addressed the same subset of e-commerce vulnerabilities and threats ... one the relatively straight-forward and simple SSL operation, and the other a horribly complex, expensive, redundant and superfluous PKI operation ... which would you choose?

An additional issue with the horribly complex, expensive, redundant and superfluous PKI-based solutions was the horrible payload bloat represented by the relying-party-only certificates
https://www.garlic.com/~lynn/subpubkey.html#rpo

... the typical payment message payload size is on the order of 60-80 bytes ... while the attached redundant and superfluous relying-party-only digital certificates represented a payload size on the order of 4k-12k bytes ... a horrible payload bloat increase by a factor of one hundred times.
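The bloat arithmetic above, worked out: the ratio ranges from roughly 50x at the small-certificate end to roughly 200x at the large end, i.e. on the order of one hundred times.

```python
# Payload bloat: certificate size relative to the payment message it is
# attached to, at both ends of the ranges quoted above.
for cert_bytes, msg_bytes in [(4096, 80), (12288, 60)]:
    print(f"{cert_bytes}-byte certificate / {msg_bytes}-byte message = "
          f"{cert_bytes // msg_bytes}x bloat")
```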

As mentioned in the previous posting,
https://www.garlic.com/~lynn/2005l.html#22 The Worth of Verisign's Brand
the x9a10 financial standard working group which was tasked with preserving the integrity of the financial infrastructure for all retail payments actually attempted to address major additional threats and vulnerabilities with x9.59
https://www.garlic.com/~lynn/x959.html#x959

and there was actually a pilot project that was deployed for iso 8583 nacha trials ... see references at
https://www.garlic.com/~lynn/x959.html#aads

part of the market acceptance issue is that the market place has been so saturated with PKI-oriented literature ... that if somebody mentions digital signatures ... it appears to automatically bring forth images of horribly expensive, complex, redundant and superfluous PKI implementations.

The Worth of Verisign's Brand

From: lynn@garlic.com
Newsgroups: netscape.public.mozilla.crypto
Subject: Re: The Worth of Verisign's Brand
Date: Mon, 27 Jun 2005 09:18:39 -0700
Anders Rundgren wrote:
Absolutely! However, there is no infrastructure in place for that.

the issue with x9.59
https://www.garlic.com/~lynn/x959.html#x959
and aads
https://www.garlic.com/~lynn/x959.html#aads

is that there are absolutely no changes to existing infrastructures, business processes and/or message flows ... they all stay the same ... there is just a straight-forward upgrade of the authentication technology (while not modifying existing infrastructures, business processes, and/or message flows).

aggressive cost optimization for a digital-signature-only hardware token would result in a negligible difference between the fully-loaded roll-out costs for the current contactless RFID program and the fully-loaded costs for a nearly identical contactless digital signature program.

the advantage over some of the earlier pki-oriented payment rollouts
https://www.garlic.com/~lynn/2005l.html#23

is that in addition to addressing the eavesdropping vulnerability for data-in-flight (already addressed by the simpler SSL-based solution) ... it also provides countermeasures for impersonation vulnerabilities as well as numerous kinds of data breach and identity theft vulnerabilities.

https://www.garlic.com/~lynn/2005l.html#22

PKI Crypto and VSAM RLS

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **
From: lynn@garlic.com
Subject: Re: PKI Crypto and VSAM RLS
Newsgroups: bit.listserv.ibm-main
Date: 29 Jun 2005 16:31:18 -0700
Hal Merritt wrote:
We are a two LPAR basic sysplex with XCF and a JES MAS. Z/os and z/os.e 1.4 on a z/890. Symmetric key crypto is mission critical for our most loved online. Failure requires up to date resumes.

the basic technology is asymmetric key cryptography ... where there are pairs of keys ... and what one key encodes, the other key decodes (as opposed to symmetric key cryptography, where the same key is used for both encoding and decoding).

a (public key) business process has been defined where one key is identified as public and made readily available. the other of the key pair is identified as private and kept confidential and never divulged. public keys can be registered in place of pins, passwords, and/or other shared-secrets for authentication. in a shared-secret environment .... somebody having access to the registered authentication information also has access to the same information that is used for origination and therefore can impersonate. in a public key environment, somebody with access to the public key can only authenticate but can't also use the public key to impersonate.

there is an additional business process that has been defined called digital signatures. for a digital signature, the originator computes the hash of a message and encodes it with their private key. they then transmit the message and the digital signature. the receiver then recomputes the hash of the message, decodes the digital signature with the public key (producing the original hash) and compares the two hashes. if they are equal, the recipient then has some assurance that 1) the message hasn't been altered and 2) the sender is authenticated.
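that sign/verify round trip can be sketched with textbook RSA. the primes, exponents, and message below are purely hypothetical illustration values (nothing close to secure parameters), and the hash is folded into the tiny modulus just to keep the arithmetic visible:

```python
import hashlib

# toy RSA key pair -- tiny textbook numbers, for illustration only
p, q = 61, 53
n = p * q              # modulus 3233, part of both keys
e = 17                 # public exponent
d = 2753               # private exponent: (e * d) % ((p-1)*(q-1)) == 1

def sign(message):
    # originator: compute the hash of the message, encode it with the private key
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message, signature):
    # receiver: recompute the hash, decode the signature with the public key
    # (producing the original hash), and compare the two hashes
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h
```

a tampered signature decodes to a different value than the recomputed hash, so the comparison fails and the receiver loses both assurances at once.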

another business process has been defined called PKI (Public Key Infrastructure) which involves certification authorities (CAs) and (digitally signed) digital certificates. This was somewhat targeted at the offline email environment of the early 80s; somebody dialed their local (electronic) post office, exchanged email, and hung up. They might now have to deal with a first time communication from a total stranger that they had never previously communicated with ... and they had neither local resources nor access to online resources as a basis for establishing trust in the total stranger (aka they couldn't call up credit bureaus, financial institutions, etc).

Basically the local user (or relying party) has a local trusted repository of public keys .... public keys belonging to entities that they already trust. In PKI, this local trusted public key repository is extended to include the public keys of certification authorities (or CAs). CAs register the public keys and other information about individual entities. They then create something they call a "digital certificate" which is a message containing the entity's registered information (including their public key) and is digitally signed with the CA's private key.

Now, a total stranger originating some message for first time communication, can digitally sign the message; sending off a combination of the basic message, their digital signature, and the digital certificate that has been issued to them.

The recipient receives the first time communication from a total stranger, validates the attached digital certificate (by using the CA's public key from their trusted public key repository), extracts the stranger's public key from the digital certificate and validates the message digital signature ... and then processes the message. They can trust that the message originated from some entity which is described by the certified information in the attached digital certificate.
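that two-step validation (validate the certificate with the CA's public key, then validate the message with the certified public key) can be sketched end-to-end. everything here is hypothetical illustration -- textbook RSA with tiny primes, and a dict standing in for a real certificate format:

```python
import hashlib, json

def toy_keypair(p, q, e=17):
    # textbook RSA with tiny primes -- hypothetical illustration only
    n = p * q
    d = pow(e, -1, (p - 1) * (q - 1))   # private exponent
    return (n, e), (n, d)               # (public key, private key)

def hash_int(data, n):
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign(priv, data):
    n, d = priv
    return pow(hash_int(data, n), d, n)

def verify(pub, data, sig):
    n, e = pub
    return pow(sig, e, n) == hash_int(data, n)

# the CA registers the stranger's public key and issues a "digital
# certificate": the registered info, signed with the CA's private key
ca_pub, ca_priv = toy_keypair(61, 53)
stranger_pub, stranger_priv = toy_keypair(67, 71)
cert_body = json.dumps({"name": "stranger", "pubkey": stranger_pub}).encode()
certificate = {"body": cert_body, "ca_sig": sign(ca_priv, cert_body)}

# recipient: validate the certificate with the CA public key from the
# local trusted repository, extract the stranger's public key from it,
# then validate the message digital signature
message = b"first time communication"
msg_sig = sign(stranger_priv, message)

assert verify(ca_pub, certificate["body"], certificate["ca_sig"])
registered = json.loads(certificate["body"])
assert verify(tuple(registered["pubkey"]), message, msg_sig)
```

note that the recipient's only long-lived trust anchor is ca_pub; the stranger's key arrives inside the certificate itself.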

In the early 90s, there were PKI x.509 identity certificates where the CAs were pondering overloading the certificate with lots of personal information ... since they couldn't necessarily predict what kind of information future relying parties might be interested in (the more relying parties that found the x.509 certificates useful ... the more valuable the x.509 certificates and possibly the more that the CAs might be able to charge for the certificates).

In the mid-90s, some number of institutions were starting to realize that x.509 identity certificates grossly overloaded with personal information represented significant privacy and liability issues. As a result, there was the introduction of relying-party-only certificates
https://www.garlic.com/~lynn/subpubkey.html#rpo

which just contained some sort of database lookup identifier (like an account number) and a public key. However, it is trivial to show that relying-party-only certificates are redundant and superfluous since their use implies that the recipient already has a repository of all the necessary information and therefore doesn't require a digital certificate (which was designed to address first time communication with a stranger where the recipient had no other available recourse to information about the originator).

Furthermore, with the Internet becoming more pervasive and ubiquitous, the situations where a recipient doesn't have other (real-time and online) avenues for information about a stranger (in a first time communication) are rapidly dwindling. Some of the PKI CAs have attempted to move into the no-value market segment ... where a recipient might have available means to obtain information about a stranger .... however, the value of the operation doesn't justify the expense. A somewhat secondary issue is that as the Internet becomes more and more pervasive .... the cost of using it is also rapidly declining ... further squeezing the no-value market segment where the recipient can't justify accessing realtime, online information.

ESCON to FICON conversion

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: ESCON to FICON conversion
Newsgroups: bit.listserv.ibm-main
Date: 29 Jun 2005 18:04:32 -0600
nigel.salway@ibm-main.lst (Salway, Nigel) writes:
I am looking to scope out a possible conversion from ESCON to FICON on a Shark I manage. There are currently 4 ESCON channels EMIFed to 4 LPARs on one CPU. If this were to be converted to FICON channels, approximately how many would I need? With only one CPU in the complex, can I connect the channels natively or should I look at a FICON director? I don't currently use an ESCON director.

escon is a half-duplex implementation (somewhat emulating bus&tag operation) that had been kicking around POK since the late 70s. while it is rated at somewhere around 17mbyte/sec transfer ... the half-duplex latencies can cut into that.

one of the rs/6000 engineers took a look at escon ... and somewhat modified it, increasing bit rate from around 200mbit/sec to 220mbit/sec, using significantly cheaper drivers. This was released with the original rs/6000 as serial link adapter (SLA).

we had been working with several labs and industry standards groups. LLNL was somewhat taking a high-speed serial copper installation they had and pushing it in the standards groups as the fiber channel standard (at the time: fiber, 1gbit/sec, full-duplex, 2gbit/sec aggregate ... with none of the additional thruput degradation associated with the latencies involved in turning around a half-duplex connection).

The SLA engineer had started work on a 800-mbit (per second, full duplex) version ... but we managed to convince him to join the fiber channel standard work (where he became editor of the fiber channel standard document). By at least 1992, you were starting to see FCS connectivity.

One of the issues in the FCS standards group was the battles where traditional mainframe, half-duplex oriented engineers were attempting to layer half-duplex "mainframe channel i/o" protocols on top of the underlying full-duplex fiber channel standard.

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

How does this make you feel?

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: How does this make you feel?
Newsgroups: comp.arch
Date: 30 Jun 2005 09:27:35 -0600
"John Mashey" writes:
This has no restrictions of alignment, and barely any of size (16MB), and survives exceptions without weird extra state. These instructions essentially use register-pairs as byte-string-descriptors, and are relatively straightforward to use.

the compare version is: CLCL Ra, Rb compares long strings but they left the logical operators out [no NCL, OCL, XCL]

That gives "memcmp" directly.


the 360 instructions would check start & end locations for fetch and store violation before starting the instruction (fetch and store protection specification were on 2k aligned boundaries); lengths in these instructions were never more than 256 ... so the worst case fetch/store protection areas were at the start and the end. on 360/67 with virtual memory support and 4k pages ... the start and end locations were also pre-checked for available pages (before starting the instruction). the worst case on 360/67 was 8 virtual pages:

1) execute instruction that crossed 4k boundary (2 pages)
2) SS instruction (target of the execute) that crossed 4k boundary (2 more pages)
3) source location of the SS instruction that crossed page boundary (2 more pages)
4) target location of the SS instruction that crossed page boundary (2 more pages)
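a minimal sketch of that worst-case arithmetic, using hypothetical addresses chosen so every field straddles a 4k boundary (the 4-byte EX and 6-byte SS instruction lengths are the standard 360 formats):

```python
PAGE = 4096  # 4k pages on the 360/67

def pages_touched(addr, length):
    # set of page numbers spanned by a field of `length` bytes at `addr`
    return set(range(addr // PAGE, (addr + length - 1) // PAGE + 1))

# hypothetical addresses, each field straddling a page boundary:
ex_addr  = 1 * PAGE - 2   # 4-byte execute instruction across a boundary
ss_addr  = 3 * PAGE - 3   # 6-byte SS instruction (target of the execute)
src_addr = 5 * PAGE - 1   # SS source operand across a boundary
dst_addr = 7 * PAGE - 1   # SS target operand across a boundary

pages = (pages_touched(ex_addr, 4)    # 2 pages
       | pages_touched(ss_addr, 6)    # 2 more
       | pages_touched(src_addr, 2)   # 2 more
       | pages_touched(dst_addr, 2))  # 2 more
assert len(pages) == 8                # the 8-page worst case
```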

the interruptable "long" instructions (introduced with 370) were not defined as having all required storage locations pre-checked; they were defined as being checked on a byte-by-byte basis, causing an interrupt (with updated register values which allowed for restarting the instruction). I was involved in shooting a microcode bug on the 370/125 (& 370/115) where the microcoders had incorrectly checked starting and ending locations on the long instructions before starting (if something was wrong with the ending location, it would interrupt before starting the instruction ... which was correct for the 360 instructions but incorrect for the 370 "long" instructions).

in a recent discussion on this subject ... it has been brought to my attention that more recent machines have fixed a "bug" in the (original 360) translate SS instructions. translate instructions take a 256 character "table" that is used for changing or testing the source string. standard 360 involved checking the table starting address and the table ending address (start+256). However, a programmer that knew they had a constrained set of characters in the input stream was allowed to define "short" tables (less than 256 bytes). However, the original instruction implementations would check the worst case table ending address (start+256). the instruction bug fix is that if the start of a table is within 256 bytes of a boundary, the instruction is pre-executed, checking each byte in the input string for possible values that would address a table byte on the other side of the boundary (aka the translate instructions take each input byte and add its value to the table start address to index a one byte field).
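the translate semantics described above (each input byte indexes a one-byte table entry) can be sketched as follows; the 16-entry "short" table is a hypothetical example of the constrained-input case:

```python
def translate(data, table):
    # 360 TR semantics: each input byte is added to the table start
    # address and the one-byte field at that location replaces it
    return bytes(table[b] for b in data)

# hypothetical short table: the program "knows" input bytes stay in 0..15,
# so only 16 entries are defined instead of the full 256
short_table = bytes(range(0x40, 0x50))
assert translate(bytes([0, 5, 15]), short_table) == bytes([0x40, 0x45, 0x4F])
```

an input byte of 16 or more would index past the short table -- the out-of-bounds table byte is exactly what the original worst-case (start+256) pre-check could fault on, even when no input byte ever reached that far.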

some recent postings
https://www.garlic.com/~lynn/2005j.html#36 A second look at memory access alignment
https://www.garlic.com/~lynn/2005j.html#38 virtual 360/67 support in cp67
https://www.garlic.com/~lynn/2005j.html#39 A second look at memory access alignment
https://www.garlic.com/~lynn/2005j.html#40 A second look at memory access alignment
https://www.garlic.com/~lynn/2005j.html#44 A second look at memory access alignment
https://www.garlic.com/~lynn/2005k.html#41 Title screen for HLA Adventure? Need help designing one

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

IBM/Watson autobiography--thoughts on?

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: IBM/Watson autobiography--thoughts on?
Newsgroups: alt.folklore.computers
Date: 30 Jun 2005 16:19:01 -0600
rpl writes:
It is in respect that *all* farmers run their own business from buying feed/fertilizer to selling to a conglomerate for distribution; even the field hands are only one step away from management. City-types generally work for somebody else.

there has been some discussion about GDP/GNP ... a lot of aid is oriented towards improving GDP/GNP of "poorer" countries. an issue that has been under discussion is that in some number of rural communities ... people consume what they grow and therefore it doesn't show in the GDP/GNP. The GDP/GNP efforts are oriented towards getting farmers to sell their produce to somebody .... and then turn around and buy it back for personal consumption .... which significantly increases the GDP/GNP bookkeeping ... compared to just consuming it directly.

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

Importing CA certificate to smartcard

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: Importing CA certificate to smartcard
Newsgroups: netscape.public.mozilla.crypto
Date: 01 Jul 2005 09:20:05 -0600
"Vivek Chadha" writes:
"When I attempt to import the CA cert to the smartcard, why doesn't the NSS create a 'new' cert in the token store (on the smartcard)? Instead, you just see a copy over of the CA cert to the token store"

I think the reason is that when you attempt to move a CA cert, you do not have its corresponding private key and the NSS will not let you 'own' the CA cert. The smartcard is a secure storage for only the certs that the individual entity owns. The 'public' CA cert is exactly that...'public' so there is no reason to move it elsewhere.


a hardware token is nominally secure storage for a private key ... from PAIN
• privacy (sometimes CAIN & confidentiality)
• authentication
• integrity
• non-repudiation


you are specifically looking at privacy and confidentiality for the private key.

asymmetric key cryptography is the technology ... what one key of a key pair encodes, the other of the key pair decodes (differentiated from symmetric key cryptography where the same key is used for encoding and decoding).

there is a business process defined called
public key

, where one of the keys (of a key pair) is labeled "public" and freely distributed. The other key is labeled "private" and is kept confidential and never divulged.

layered on this is another business process called
digital signature

. the originator computes a hash of a message and encodes the hash with their private key. they then transmit the original message and the attached "digital signature". the recipient decodes the digital signature with the public key, recomputes the message hash and compares the two hash values (the recomputed and the decoded). if they are equal, then the recipient can assume 1) the message hasn't been altered and 2) something you have authentication (aka the sender has access to and use of the corresponding private key) ... aka from 3-factor authentication
something you have
something you know
something you are


most business operations have long standing and well established relationship management infrastructures. they use such relationship management infrastructures to record things about the relationship (address, current account balance, permissions, etc) as well as authentication material ... in the past, frequently shared-secrets (mother's maiden name, SSN, pin or password). however, it is possible to use the same well established and long standing relationship management infrastructures to record a public key as authentication material. the advantage that public keys have over shared-secrets is that with shared-secrets, people with access to the relationship management infrastructure can also use the shared-secret authentication material for impersonation. the public key can only be used for authentication (and not impersonation). there was a report in the last year or so that something like 70 percent of account/identity theft involves insiders.

business processes also have a requirement that their relationship management repository has integrity (from PAIN) .... aka that they can trust the information contained in the relationship management repository (in addition, there may be confidentiality requirements if the repository contains shared-secret authentication material, since that information could be used to impersonate).

there are additional business processes which have an original design point of offline email from the early 80s. this has several pieces: digital certificates, certification authorities, PKI, etc. In the early 80s, a recipient dialed their local (electronic) post office, exchanged email and hung up. they were now potentially faced with processing first time communication from a total stranger (and had no recourse to local or other information about the total stranger).

The total stranger registers their public key and other information in a relationship management infrastructure run by a certification authority. The certification authority then creates a specially formatted, digitally signed message called a digital certificate (containing the registered information). Now a stranger, with first time communication, creates a message, digitally signs it and sends off the message, the digital signature and the digital certificate.

first off, a recipient has extended their trusted repository to include the authentication public keys of some number of these certification authorities. when the recipient receives a first time, digitally signed message from a total stranger ... with an attached digital certificate ... they can then use the CA's public key (from their trusted repository) to validate the (digital signature on the) digital certificate. then they can use the sender's public key from the digital certificate to validate the message's digital signature. they use the additional information contained in the digital certificate (copied from the certification authority's relationship management repository) in the processing of first time communication from a total stranger. this is an alternative to the recipient already having the stranger's public key directly registered in the recipient's relationship management infrastructure.

the digital certificate format is sometimes convenient for transporting CA public keys ... with the objective that they are sent to a relying party who then can load the associated public key into their trusted relationship repository. they typically aren't digital certificates in the business process definition since they are frequently self-signed and the receiving relying party must perform some other process before deciding to load the associated public key into their trusted relationship repository.

In the early 90s, you were finding some certification authorities looking at grossly overloading x.509 identity certificates with personal information (because they weren't able to predict who the recipients were going to be and/or what their requirements might be when dealing with a total stranger). In such a PKI/CA business process, these recipients are also referred to as relying parties (because they are relying on the information supplied by the certification authorities in the digital certificate).

In the mid-90s, you started to see some institutions realizing that x.509 identity certificates, grossly overloaded with personal information, represented significant privacy and liability issues. In this time-frame you saw some retrenchment to relying-party-only certificates. These relying-party-only certificates effectively contained some sort of database lookup index (like an account number) and a public key. However, it is trivial to show that such relying-party-only certificates are redundant and superfluous i.e. by definition, the relying party already has access to all the necessary information in their relationship management infrastructure.

In some PKI/CA payment initiatives from the mid-90s, they also found out that such relying-party-only certificates represented severe payload bloat. The typical payment message size is on the order of 60-80 bytes while the relying-party-only certificate overhead ran 4k-12k bytes. Not only was it trivial to show that such relying-party-only certificates were redundant and superfluous, but they also represented a factor of 100 times increase in payload bloat (which is pretty significant for something not serving any useful purpose).

as addenda ... with the drastic cost reductions for online connectivity and the ubiquitous availability of the internet, some of the CA/PKI operations are looking at moving into the no-value market segment (since the offline market segment where the relying party had no other recourse to information about first time communication with a stranger, is rapidly disappearing). the no-value market segment is where the relying party can't justify the cost of an online transaction to determine information about first-time communication with a stranger. One issue is that the no-value market segment probably isn't looking at spending a great deal of money on digital certificates.

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

A good argument for XML

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: A good argument for XML
Newsgroups: comp.databases.theory
Date: 01 Jul 2005 10:47:52 -0600
Gene Wirchenko writes:
The problem has been solved before. XML is solving old problems.

gml was invented in '69 at the science center
https://www.garlic.com/~lynn/subtopic.html#545tech

by "G", "M", and "L". and gml support added to the existing cms script processing command. later in the 70s, it was standardized in iso as SGML
https://www.garlic.com/~lynn/submain.html#sgml

also in somewhat the same mid-70s time-frame ... the original relational, sql implementation (system/r) was done on the same platform at sjr
https://www.garlic.com/~lynn/submain.html#systemr

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

More Phishing scams, still no SSL being used

Refed: **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: More Phishing scams, still no SSL being used...
Newsgroups: netscape.public.mozilla.crypto
Date: 01 Jul 2005 12:17:14 -0600
"Vivek" writes:
Wouldn't a time of expiry be more relevant (to a client) as part of the ocsp response (just wondering if there was a reason that this was already considered and rejected)?

the issue is that PKI, certification authorities, digital signatures, etc. were invented to address the offline trust problem ... where a relying party had no access to information about a stranger in first time communication.

OCSP sort of came on the scene in the mid-90s after I had been pointing out that suggestions about converting the payment card network to "modern" PKI actually represented a technology regression of 20 or more years.

the credit card industry had done offline processing with plastic credentials and invalid account booklets mailed to all merchants, first every month, then weekly, then possibly looking at printing tens of millions of invalid account booklets and mailing them out every day.

so instead, they transitioned to online transactions by adding a magstripe to the existing plastic credential. now, rather than relying on stale, static credential information, they could do a real live, online transaction (and poof goes the problem of mailing out tens of millions of account invalidation booklets every couple of hrs).

the observation is that OCSP goes to all the overhead and expense of having an online transaction ... but actually is returning very little useful information.

If you started suggesting that OCSP should start returning actual, useful information, then somebody might conclude that you could get rid of the certificates altogether and just go to a real online transaction (instead of a pseudo offline infrastructure with most of the downside of being offline, but also most of the overhead of an online transaction).

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

More Phishing scams, still no SSL being used

Refed: **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: More Phishing scams, still no SSL being used...
Newsgroups: netscape.public.mozilla.crypto
Date: 01 Jul 2005 14:30:05 -0600
Ram A Moskovitz writes:
Are you sure OCSP didn't come out of the IETF as an effort to 'standardize' the VeriSign / Microsoft certificate status service that was launched as an extension to ActiveX?

it may have ... but i know there were people starting to discuss some sort of real-time rube goldberg contraption that attempted to preserve the facade of an offline operation while throwing in a minimum of online operation.

i was getting hit with "wouldn't it be a modern marvel to convert the payment infrastructure to certificates" ... and then pointing out to them that conversion to an offline certificate-based operation would actually represent regressing the payment infrastructure by at least 20 years.

when we were doing this thing with this small client/server startup that wanted to do payments
https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3

we had to do due diligence on the major operations that were going to be issuing these things called SSL domain name certificates in support of the operation. we were constantly pointing out that most of them were actually doing certificate manufacturing (a term we coined at the time) and hadn't actually bothered to implement a real PKI that actually administered and managed the infrastructure. furthermore, the payment infrastructure had learned at least 25 years earlier that revocation lists scaled extremely poorly.

the following is from the PKIX WG minutes apr 7-8 1997
Sharon Boeyen presented the work to date on Part 2 regarding the use of LDAP and FTP for retrieval of certificates and CRLs and the requirements for and specification of an Online Certificate Status Protocol (OCSP).

DISCUSSION

1 - Should we consider splitting the document into two separate ones, since the OCSP is a new protocol definition which may require significant more review and discussion than the LDAP and FTP profiles?

Resolution: Although we agree that OCSP may require additional review, the document will remain a single draft and we will re-address this issue, if the OCSP discussion is such that it will require a longer review period and impede progression of the remainder of the document.


.....

I have the original email ... but it can also be found here
http://www.imc.org/ietf-pkix/old-archive-97/msg00316.html

some comment about the lead architect for ocsp was from valicert
http://www.rsasecurity.com/press_release.asp?doc_id=334&id=1034

in addition to ocsp ... at about the same time there were some other infrastructures looking at various gimmicks to improve the revocation process. note in the following announcement ... they were almost quoting me word-for-word about how archaic the CRL process actually is.
Date: Wed, 29 Oct 1997 21:05:12 -0800

SUBJECT: VALICERT TACKLES FLAW IN E-COMMERCE SECURITY

A group of Silicon Valley entrepreneurs has set out to correct a flaw in the digital certification process that many Internet experts have been counting on to make Internet commerce secure.

The solution, called a certificate revocation tree, is the property of Valicert Inc., a Sunnyvale, Calif., company formed last year and officially opened for business this week.

In a sign that Valicert may be on to something that could bring added security to Internet transactions, three vendors in the data encryption field have given endorsements, and Netscape Communications Corp. has made a provision for Valicert's technology to "plug in" to the SuiteSpot server software.

The advent of Valicert indicates that digital certification-a cryptographic technique that is believed to be on the road to broad public acceptance through Internet security protocols such as the credit card industry's SET-needs further refinement. "Today there is no way to know if a certificate is valid at the time of a transaction-it is known only that the certificate was valid at the time of issuance," said Joseph "Yosi" Amram, president and chief executive officer of Valicert.

He said that if not for the Valicert method of keeping revoked certificates from being approved-it will be available in the form of a tool kit for system developers, a server system, and a service from Valicert-electronic commerce could collapse under the weight of millions of digital certificates that cannot be adequately validated. SET, the Secure Electronic Transactions protocol adopted by MasterCard and Visa for on-line credit card transactions, illustrates the problem in the extreme. SET requires issuance of digital certificates to all parties to a transaction. They are the E-commerce equivalent of a driver's license to verify a cardholder's identity or a certification that an on-line merchant is what it claims to be. The complexity of processing transactions with those multiple certificates is widely seen as slowing the adoption of SET. But digital certificates have already been issued by the millions through Netscape and Microsoft Corp.'s Internet browsers. Verisign Inc. and GTE Corp. are prominent certificate vendors. GTE, Entegrity Solutions, and Entrust Technologies, the leader in public key infrastructure systems, have each agreed to some form of collaboration with Valicert.

Valicert's efforts can "expand the security infrastructure available for commerce," said Tom Carty, vice president of marketing and business development at GTE. "Given our focus on providing all of the pieces of the infrastructure required to make Internet commerce possible, it makes great sense for us to partner with Valicert to fill in one of the most essential pieces of the infrastructure puzzle-the digital credential checkpoint."

In a recent interview, Mr. Amram and Valicert chairman Chini Krishnan said the problem is akin to what the credit card industry faced before electronic authorization systems.

"A merchant would get a book, which came once a week or once a month, full of bad credit card numbers, and credit cards presented at the point of sale would have to be looked up manually," said Mr. Amram, who joined Valicert in August after being involved in other high-tech start-ups and in the Silicon Valley venture capital scene. "It was a big hassle and it slowed down checkout."

The digital certificate equivalent of the hot-card list is known as the certificate revocation list, or CRL.

Mr. Krishnan, the Valicert founder, said CRLs are "unscalable," meaning they become cumbersome, if not impossible, to manage as they approach mass-market proportions. The lack of scalability "has posed a barrier to widespread deployment," Mr. Krishnan said. He claimed that the invention of the certificate revocation tree brings a "1,000-to-1 advantage" that solves the problem of revocation and validation in a tamper-proof and economical way.

"Developers need a cost-effective, one-step solution for building applications that can check the validity of digital certificates," Mr. Amram said. "By providing a clearing house network into multiple certification authorities, and by delivering a robust technology combined with a liberal licensing policy, Valicert will enable the widespread development and use of applications that will make the Internet and corporate intranets safe to conduct business."

"Certificates are the only way to deal with identity in any meaningful way," Mr. Amram said. "They will take off in a big way. But certificates without validation are like a car without brakes."

Mr. Krishnan said the development of Valicert's technology had "a lot of rocket science elements," which is why it took the company 20 months to reach the launch stage. Enhancing its credentials, Paul Kocher, a leading cryptography researcher, is credited with inventing the underlying technology. Martin Hellman, a Stanford University professor and half of the Diffie-Hellman team that invented public key cryptography, is on Valicert's scientific advisory board.

Commercializers of cryptographic security have been intrigued by Valicert's proposition. When he heard about it during American Banker's Online '97 conference in Phoenix, Scott Dueweke, a marketing manager in International Business Machines Corp.'s Internet division, said, "They should call us."

Another expert, who asked not to be identified, said Valicert's biggest problem is that it is a few years ahead of its time. "The market has fallen down with respect to revocation management, relying on relatively short expiration dates" to minimize invalid certificates, said Victor Wheatman, a California-based analyst with Gartner Group, Stamford, Conn. "Valicert fills a void and hopes to develop technology before the leading players move forward with their own revocation capabilities."

Valicert's server and tool kit are available now, and its service to certificate acceptors will enter field trials later this year, the company said. The tool kit can be downloaded from the valicert.com Web site free for noncommercial use and evaluation purposes. Application development licenses are a flat $995 with unlimited sublicense rights. The server can be deployed on corporate intranets for $9,995.


--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

More Phishing scams, still no SSL being used

Refed: **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: More Phishing scams, still no SSL being used...
Newsgroups: netscape.public.mozilla.crypto
Date: 01 Jul 2005 14:53:49 -0600
Anne & Lynn Wheeler writes:
"Certificates are the only way to deal with identity in any meaningful way," Mr. Amram said. "They will take off in a big way. But certificates without validation are like a car without brakes."

of course the above quote was left over from the early 90s and the x.509 identity certificates ... that by the mid-90s were in danger of being overloaded with enormous amounts of personal information .... and you were starting to see some infrastructures moving to relying-party-only certificates
https://www.garlic.com/~lynn/subpubkey.html#rpo

containing little more than some type of database lookup value (like account number) and the public key (as a way of dealing with the significant privacy and liability issues that go along with x.509 identity certificates containing enormous amounts of personal information).

part of the issue is that most business processes have well-established and long-entrenched relationship management infrastructures ... that contain detailed and real-time information about the parties they are dealing with. in such environments it was trivial to show that the relying-party-only certificates (indexing an online relationship management infrastructure containing the real information) were redundant and superfluous.

in fact, stale, static digital certificates of nearly any kind become redundant and superfluous when the business process has to deal with an established online, real-time relationship management infrastructure.

the target for digital certificates, PKIs, etc ... was offline relying parties involved in first-time communication with total strangers, where they had no recourse to information about the party they were dealing with (sort of the letters-of-credit model from the sailing ship days).

as the internet becomes more ubiquitous, the offline market segment is rapidly disappearing. there has been some shift by PKI operations into the no-value market segment ... where the relying party can't justify the cost of an online transaction when first-time interactions with strangers are involved. However, as the internet becomes more and more ubiquitous, the cost of using the internet for online operations is also rapidly dropping ... creating an enormous squeeze on even the no-value market segments.

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

More Phishing scams, still no SSL being used

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: More Phishing scams, still no SSL being used...
Newsgroups: netscape.public.mozilla.crypto
Date: 01 Jul 2005 15:43:00 -0600
oh, what the heck ... a little more fun with walk down memory lane (Anne & I even get honorable mention)
Date: Thursday, October 8, 1998
Subject: Privacy Broker: Likely Internet Role for Banks?

Bankers' payment-systems supremacy is facing yet another challenge, but this is one the industry may be uniquely qualified to take on.

Though the threat has a familiar ring -- more disintermediation as commerce goes electronic -- the logical response plays to two of banking's strengths: trust and security.

Bankers are beginning to grasp the possibilities of transferring and adapting those valuable assets to the Internet. They are even being egged on by technology experts who have ventured deep into cyberspace and found it lacking exactly what bankers are in a position to offer.

The job of trusted third party or certifying agent or privacy broker, as it is variously called, seems theirs for the taking. As guardians of on-line trust, they would manage the electronic credentials that assure that buyers and sellers are who they say they are -- and would get paid for it.

If only it were that simple.

Electronic commerce is developing according to rules, and with a set of technology requirements, that do not directly translate from the physical world. The trust and stability evoked by banks' offices, vaults, and, more intangibly, their brand names and risk management reputations require some degree of retooling.

Experts inside and outside the banking industry agree that at the operational core of on-line trust are the techniques of digital certification. It is the closest thing to signature verification that virtual-world technologists have come up with. In theory, when fully developed and appropriately deployed, this derivative of data encryption technology, binding intricate mathematical codes to a consumer's or company's identity, could be even more reliable and secure than written signatures.

That very theory is what the MasterCard and Visa networks are supposed to be proving with SET, the Secure Electronic Transaction protocol, which requires digital certificates for banks, merchants, and cardholders. A purchaser would use the digital code, rather than a card number, to initiate a transaction; the certificate would represent authentication by a bank or other certificate authority.

SET has gained some acceptance overseas but almost none in the United States, therefore contributing little to the mainstreaming of digital certification.

The concept has, however, attained some level of business-world consciousness through initial public offerings this year by two specialists in the field, Verisign Inc. and Entrust Technologies Inc. These companies and others stand ready to give banks the software or outsourcing support they need to authenticate people doing business facelessly on the World Wide Web.

Last month, working with the Zions Bancorp. affiliate Digital Signature Trust Co. of Salt Lake City, the American Bankers Association launched ABAecom, which hopes to take responsibility for the certification hierarchy for the entire financial services industry. It starts with a root key and cascades down to user certificates and digitally signed transactions.

The banker involvement is a sign that "e-commerce is starting to grow up," said Michael Cation, president of GlobeSet Inc., an Austin, Tex., software company active in SET and digital certificates. "Financial institutions are becoming more forceful," he said.

To wit, Bankers Trust Corp. and Chase Manhattan Corp. recently contributed to a second round of financing for GlobeSet. Vice chairmen George Vojta of Bankers Trust and Joseph Sponholz of Chase took seats on its board.

"I think 1999 will be the year of PKI [public key encryption infrastructure] in the financial services industry," said Scott Lowry, president of Digital Signature Trust.

But this is complicated business. Operationally, bankers have to learn an art and science that historically had more to do with military command and communications than with buying and selling.

To make a business out of it, they have to find a way to make money. And the uncertainties get wrapped up in "who controls the payment system?" and "are banks about to lose another of their bastions?"

"Some banks are very sophisticated in this area, putting a lot of resources into developing and understanding the business opportunities," said Elliott McEntee, president of the National Automated Clearing House Association, which sponsored a digital certificate test involving BankAmerica Corp., Citicorp, Mellon Bank Corp., and Zions Bancorp.

"Others don't see the product being used on a widespread basis for three, five, or seven years," he said. "They don't see a business case."

They see no compelling need to rush into activities that are in a state of developmental flux with no apparent revenue stream. But if, as research says, perceived insecurity is inhibiting electronic commerce, who better than bankers to fill the breach?

"This happens to be a remarkably mature technology," Frank Jaffe, applied technology consultant with BankBoston Corp., said of the PKIs -- public key infrastructures -- that underlie digital certificate operations.

"But the application of the technology, from a business perspective, is very immature," he said. "We will see serious changes in the business model as this goes forward."

Bankers have let too many of their dominant businesses slip away -- large-corporate lending, credit card processing, mortgage servicing -- not to be at least a bit uneasy that the pattern will repeat itself in Internet payments and security.

"No one knows if it is going to be successful," Mr. McEntee said. "But if it is, banks had better be in there, and in a big way."

Insurance, securities, and telecommunications companies and accounting firms may have their eye on certificate authority roles.

"The market will insist on privacy brokers," said Mitchell Grooms, co-founder of Secured Information Technology Inc., a year-old company crusading for what it considers a bank-centric trust model for the digital economy.

"Either the banks will create [the business], or somebody else will," he said. "It is what banks do, and they do it well."

Mr. Grooms' Los Angeles-based company, SITI, is one of a new breed with some new ideas for building business cases around the public key and certificate authority, or CA, infrastructures that some banks find uninviting or daunting.

Aside from a "strategic vision" of the way on-line transactions will evolve, SITI enters the fray with patents on elliptic curve cryptography and a budding relationship with the transaction processing giant First Data Corp.

SITI is not alone in championing elliptic curve, a method of data scrambling that, because of some inherent efficiencies, could pose a challenge to the algorithms associated with RSA Data Security Inc., the established leader in encryption technology. Elliptic curve has been more prominently associated with Certicom Corp. of Canada, which has licensed its system to companies that make compact and wireless devices and smart cards that cannot easily handle the long RSA encryption keys.

SITI claims some superiority over Certicom, and it will take time and the marketplace to render a verdict. But promoters of elliptic curve agree that it must come into play if digital certificates are ever to be stored in chip cards or "scale up" to customers and merchants numbering in the millions.

"A lot of people are rooting for [elliptic curve] because of the short keys," said Mr. Jaffe. But first it has to get through the stress testing by scientists and business developers that made the RSA methods as dependable as they are, and some standardization bodies still have to give their imprimatur.

"Elliptic curve has been around for years and has been tested quite thoroughly," said Henry Dreifus, an Orlando-based consultant. But in the formative market stages, "companies are not betting on just one technology. They are placing many bets. At some point somebody will blink and a given process will move ahead very fast. One could own the banking trade, or insurance, or telecommunications -- that industry has been tweaking elliptic curve for some time."

There are other streamlining measures.

Assuming commerce goes global, with certificates and associated digital signatures that must be exchanged among different certificate authorities, some type of cross-certification will be required. Nacha began to get at that through interoperability testing with Entrust, Verisign, Digital Signature Trust, Certco LLC, and GTE Cybertrust Solutions.

"Issuing a certificate is easy," said John Ryan, president of Entrust, a Richardson, Tex.-based spinoff of Northern Telecom of Canada. "You can do millions an hour on a relatively inexpensive server. It is the management of the digital ID that is hard and has to be automated." That includes knowing when a certificate, like a credit card account, has expired or must be revoked.

Valicert Inc. says the customary maintenance of certificate revocation lists, or CRLs, is too unwieldy for large-scale, mass-market operations. The Mountain View, Calif., company's alternative certificate validation system addresses that problem.

Diversinet Corp., another product of Canada's PKI ferment that, like Certicom, has set up shop in Silicon Valley, sweeps the revocation problem aside. It proposes issuing to an individual a single certificate for multiple uses. Authorizations or permissions are attached to that certificate for defined or limited purposes. Processing efficiencies are gained through not needing a CRL and by limiting the personal information attached to the certificate.

"It is just like going to an automated teller machine," said Diversinet president Nagy Moustafa. "If the transaction is on-line, you validate it on-line and don't need the overhead of a CRL."

That type of thinking has led to more radical suggestions -- a different type of certificate or a revised approach to the infrastructure.

Mr. Lowry of Digital Signature Trust said "thin or anonymous certificates" could find a niche, perhaps as an alternative to the slow-moving SET. The certificate is reduced to a number for transmission over the Internet, which provides a pointer to client information in a data base.

Lynn and Anne Wheeler, a husband-and-wife team of computer scientists, have shaken up the certificate authority establishment with their proposal for AADS, Account Authority Digital Signatures.

Veterans of "skunkworks" research and development at International Business Machines Corp., the Wheelers work in advanced technology development at First Data Corp. and spend a portion of their time on the road stumping for AADS and debunking the traditional CA-driven digital signatures -- at least as they apply to on-line commerce.

The certificate authority model, they maintain, was developed for off-line authentication of parties who may not know each other. For on-line dealings where a relationship is already established, they propose simplifying certificates by integrating them in financial account records.

The simplification lends itself to large-scale deployment, possibly aided by elliptic curve cryptography. The Wheelers warn bankers and others against getting a false sense of satisfaction from limited pilots based on old technology.

"If you are doing a small pilot for 1,000 customers, the costs are in tens of thousands of dollars, and it doesn't pay to modify legacy systems," Mr. Wheeler said in a recent interview. "Once you get into significant production" -- he said that could be 5% or more of a multimillion-customer account base -- "it becomes less expensive to modify the structure for all accounts than to maintain a parallel system" for digital signatures.

The Wheelers buttress their arguments with concerns about security and privacy when certificates carry a lot of personal information over the Internet, and they emphasize a business case, including compatibility with legacy systems and conventional payment processes.

They get a lot of philosophical agreement on the latter point.

Mr. Cation said it is an article of faith for his company, GlobeSet, that all products provide "secure access to the existing infrastructure of the financial institution." Banks essentially own "the four-corner transactional model" of customer and merchant, paying bank and receiving bank, which they can carry over to e-commerce.

"The right business model to use is the banking industry's, not the military's," Mr. Cation said.

William Crowell was steeped in hierarchical CAs when he was deputy director of the National Security Agency. Now vice president of Cylink Corp., an information security vendor in Sunnyvale, Calif., he said there will be limits to certificate authority scalability, and in many business settings "I will generally prefer to get certificates for special purposes."

In government settings, "there was always a final authority, a clear hierarchy," said Nicholas DiGiacomo, who recently left Science Applications International Corp. to join the Internet business consulting firm Scient Corp. of San Francisco. "A distributed model" is needed for business, but technologists came out of the military "doing what they knew how to do."

He said businesses will be reluctant to cede trust functions to third parties and will come to exchange assurances and manage risk much as they do with letters of credit.

"Maybe you and I would want to use something like SET for a few transactions," said Mr. Dreifus. But once the relationship is established, "we would not need a Visa or the post office" as CA, and exchanges would be much cheaper.

"Banks like the account authority structure, they identify with it immediately," said Mrs. Wheeler. "It is a bank-centric approach to electronic commerce. They recognize it when they see it."

"We don't say there is no purpose in certificates," Mr. Wheeler added. "But a lot of purposes are better served with an account-based infrastructure."

Like any scientific paradigm, the Wheelers' AADS is controversial and struggling to break out. Mr. Lowry pointed out that AADS "has not been embraced by the broader CA community" and even First Data Corp. is exploring multiple options.

Yet AADS has gained the status of a proposed industry standard, X9.59, and has gotten heard by the Bankers Roundtable's Banking Industry Technology Secretariat, Global Concepts Inc.'s Internet Forum, and various panels of cryptography experts.

"The Wheelers are basically saying you can get the benefits of digital signatures without all this infrastructure," said David Stewart, vice president of Atlanta-based Global Concepts. "Maybe these mega-CAs are not necessary. Maybe people should be thinking inside the box before they go outside."

He wrote a paper calling AADS "a brilliantly simple solution with potentially far-reaching implications for the payments system as a whole."

Meanwhile, the established technology is taking root, particularly for internal corporate and business-to-business needs, where it could catch on faster than consumer e-commerce and eventually spill over. Mr. Ryan of Entrust claimed he can deliver whatever speed, simplicity, and security the critics are calling for. One of his clients, Bank of Nova Scotia, has "scaled up" to 100,000 certificates and 50,000 active users, he said.

"These will coexist for a while," said Mr. Dreifus. "This is still a pre-industry in terms of consumer-level, everyday encryption. Nobody has figured out how to manage this big-number problem of keys and certificates and the controls needed to protect the entire infrastructure."

"People say the banks are slow, but they have to go through a certain due diligence," said Mr. Stewart. "What the Wheelers have done is, at the least, a good gut check."


--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

More Phishing scams, still no SSL being used

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: More Phishing scams, still no SSL being used...
Newsgroups: netscape.public.mozilla.crypto
Date: 02 Jul 2005 12:22:57 -0600
Anne & Lynn Wheeler writes:
That very theory is what the MasterCard and Visa networks are supposed to be proving with SET, the Secure Electronic Transaction protocol, which requires digital certificates for banks, merchants, and cardholders. A purchaser would use the digital code, rather than a card number, to initiate a transaction; the certificate would represent authentication by a bank or other certificate authority.

SET has gained some acceptance overseas but almost none in the United States, therefore contributing little to the mainstreaming of digital certification.

The concept has, however, attained some level of business-world consciousness through initial public offerings this year by two specialists in the field, Verisign Inc. and Entrust Technologies Inc. These companies and others stand ready to give banks the software or outsourcing support they need to authenticate people doing business facelessly on the World Wide Web.


two issues from the historical news article posting
https://www.garlic.com/~lynn/2005l.html#34 More Phishing scams, still no SSL being used

is that their "light" digital certificates are also referred to as relying-party-only certificates
https://www.garlic.com/~lynn/subpubkey.html#rpo

... since rather than carrying the actual information, they just carry some sort of index pointer into a business relationship management infrastructure (like account number). the relationship management infrastructure contains all the real information.

however, it is trivial to show that such "light" certificates are redundant and superfluous when the business process has to access the infrastructure containing the real information. this is also a somewhat trivial logic operation when you take into account that the original design point for digital certificates was providing relying parties with information in an offline environment, where the relying parties had no other recourse to the real information; therefore, by definition, if the relying parties have access to the real information, the original purpose and justification for digital certificates is invalidated.

the other issue is that even for "light" certificates the infrastructure overhead for appending certificates ran 4k to 12k bytes. when you are talking about the basic payment card infrastructure where typical message size is 60-80 bytes, the appended certificate paradigm represents an enormous payload bloat of one hundred times (two orders of magnitude) ... for otherwise redundant and superfluous certificates.
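The bloat arithmetic can be made concrete (the message and certificate sizes are those quoted in the text; picking 70 bytes as a representative message size is my choice):

```python
# Sizes from the text: typical payment messages run 60-80 bytes; appended
# certificates run 4K-12K bytes. 70 bytes is a representative message size.
message_bytes = 70
cert_low, cert_high = 4 * 1024, 12 * 1024

print(round(cert_low / message_bytes))    # → 59 (low end of the range)
print(round(cert_high / message_bytes))   # → 176 (high end of the range)
```

i.e., roughly one hundred times, or two orders of magnitude, in the middle of the range.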

the basic technology is asymmetric key cryptography, where what one key (of a key-pair) encodes, the other of the key-pair decodes.

a business process has been defined called
public key

... where one of the key-pair is made freely available and the other key is identified as private and kept confidential and never divulged.
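The encode/decode symmetry can be illustrated with a toy "textbook RSA" key pair; the tiny primes and exponents below are purely illustrative assumptions (real systems use large keys, padding, and vetted libraries):

```python
# Toy "textbook RSA" numbers, purely to show the asymmetric-key property:
# what one key of the pair encodes, the other of the pair decodes.
p, q = 61, 53
n = p * q                     # modulus, shared by both keys
phi = (p - 1) * (q - 1)
e = 17                        # public exponent: freely made available
d = pow(e, -1, phi)           # private exponent: kept confidential (Python 3.8+)

m = 42                        # a small "message", must be < n
c = pow(m, e, n)              # encode with the public key ...
assert pow(c, d, n) == m      # ... the private key decodes it

s = pow(m, d, n)              # encode with the private key (signature-style) ...
assert pow(s, e, n) == m      # ... anyone with the public key can decode/verify
print("d =", d)               # → d = 2753
```

The same key pair serves both directions: public-key encode for confidentiality, private-key encode for digital signatures.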

a further business process has been defined called
digital signature
... which represents something you have authentication ... from 3-factor authentication paradigm
https://www.garlic.com/~lynn/subintegrity.html#3factor
something you have
something you know
something you are


a digital signature implies that some entity has access to, and (presumably sole) use of, a specific private key.

existing relationship management infrastructures can upgrade their shared-secret based authentication
https://www.garlic.com/~lynn/subintegrity.html#secret

to digital signature, by registering a public key in lieu of a pin, password, ssn, date-of-birth, mother's maiden name, etc. in secret-based infrastructures, the same value is used both to originate and to authenticate. in the public key scenario for digital signature, the public key is only used to authenticate (and can't be used to originate or impersonate).
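A minimal sketch of that upgrade, using a toy textbook-RSA signature and hypothetical field names (not any real bank's schema or a production algorithm): the account record stores a public key where a shared secret used to live, so the stored value can only verify, never originate.

```python
import hashlib

# Toy key pair; d is held only by the customer, never by the bank.
n, e, d = 3233, 17, 2753

accounts = {"acct-001": {"public_key": (n, e)}}   # no shared secret on file

def _digest(message: bytes, modulus: int) -> int:
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % modulus

def toy_sign(message: bytes) -> int:
    return pow(_digest(message, n), d, n)         # customer side: uses the private key

def authenticate(acct: str, message: bytes, signature: int) -> bool:
    modulus, pub = accounts[acct]["public_key"]   # bank side: verify only
    return pow(signature, pub, modulus) == _digest(message, modulus)

msg = b"transfer $10 to acct-002"
sig = toy_sign(msg)
assert authenticate("acct-001", msg, sig)                # genuine signature verifies
assert not authenticate("acct-001", msg, (sig + 1) % n)  # tampered signature fails
```

Compromise of the stored value lets an attacker verify signatures, which is harmless; with a stored pin or password, the same compromise lets the attacker originate.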

from the PAIN security acronym
P ... privacy (or sometimes CAIN, confidential)
A ... authenticate
I ... integrity
N ... non-repudiation


it can be easily demonstrated that relationship management infrastructures tend to have very high integrity requirements (regarding all details of the relationship as well as the authentication information).

however, many business infrastructures that make heavy use of their relationship management infrastructure for numerous business processes are also at risk of exposing the authentication information. when this authentication information is a public key, it tends not to be a big deal. however, when the authentication material is a secret, then there is an enormous privacy requirement (since obtaining the secrets also enables impersonation, fraudulent transactions, account fraud, etc).

Using secret-based authentication can create enormous, diametrically opposed objectives for a relationship management infrastructure ... on one hand the relationship management infrastructure has to be readily available in support of numerous business operations .... and on the other hand, the secret-based privacy requirements mean that none but extremely constrained business operations can access the information.

one such description is my old security proportional to risk posting
https://www.garlic.com/~lynn/2001h.html#63

and a whole host of postings on skimming and harvesting of (secret-based) authentication material (that can be leveraged to perform fraudulent transactions)
https://www.garlic.com/~lynn/subintegrity.html#harvest

as an aside ... there was a report within the past couple years that something like 70 percent of identity/account fraud involved insiders. there have been some additional, similar reports. there were a number of these news URLs in the past couple days:

Bank workers biggest ID theft threat; Insiders with access to data may pose 70% to 80% of risk
http://deseretnews.com/dn/view/0,1249,600145529,00.html

Banks Face Challenge In Screening Employees To Avert Inside ID Thefts
http://www.banktech.com/aml/showArticle.jhtml?articleID=164904297

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

More Phishing scams, still no SSL being used

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: More Phishing scams, still no SSL being used...
Newsgroups: netscape.public.mozilla.crypto
Date: 02 Jul 2005 13:29:32 -0600
Ram A Moskovitz writes:
Given that individual privacy doesn't have a directly measurable value to a business, there is not the same motivation to protect it, and so most individuals rely on the various community participants to do the right thing which, given the long feedback loop, is taking longer than many of us would want.

one of the reasons given for the migration away from x.500 and x.509 identity certificates in the mid-90s was the enormous privacy and liability issue that they represented (containing enormous amounts of openly available personal information). referenced in previous news postings
https://www.garlic.com/~lynn/2005l.html#34 More Phishing scams, still no SSL being used

however, it is trivial to show that if the relying-party is going to access some form of relationship management infrastructure containing all the real information, then any stale, static digital certificates are redundant and superfluous.

the issue of PKIs moving into no-value transactions .... is that a relationship management infrastructure typically contains a lot more timely and higher quality information for making decisions. If the infrastructure can justify the value of having higher quality online information ... then the PKIs and stale, static digital certificates are redundant and superfluous. That leaves PKIs looking for the rapidly shrinking markets where 1) the relying party can't access the real information directly (restricted to offline environment) and/or 2) the relying party can't justify the need to have direct and timely, higher quality information.

in the mid-90s, FSTC
http://www.fstc.org/

was in a quandary over FAST ... basically doing simple digitally signed transactions but expanding them to uses beyond financial transactions. there are some implied references to that opportunity in the old news posting
https://www.garlic.com/~lynn/2005l.html#34 More Phishing scams, still no SSL being used

in the x9a10 financial standard working group, we were charged with preserving the integrity of the financial infrastructure for all retail payments. the x9.59 standard was the result
https://www.garlic.com/~lynn/x959.html#x959

it basically is the minimum payload increase to existing payment messages. it can be mapped to iso 8583 debit, credit, and stored-value messages
https://www.garlic.com/~lynn/8583flow.htm

with the addition of a couple additional minimal fields and a digital signature (and no enormous payload bloat by appending stale, static, redundant and superfluous digital certificates).

The FAST scenario was basically to enable the asking of yes/no questions about things other than financial transactions (i.e. the title of the referenced news article: Privacy Broker: Likely Internet Role for Banks?).

For instance a merchant could ask if the person was of legal drinking age. There was no requirement to divulge the actual birthdate (birthdates are widely used as a means of authentication, so divulging birthdates represents an identity fraud threat).
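A sketch of that FAST-style yes/no query (all names here are hypothetical; the point is only that the relying party gets back a boolean, never the birthdate itself):

```python
from datetime import date

# Record held by the account authority; the merchant never sees it.
_records = {"acct-001": {"birthdate": date(1980, 6, 15)}}

def is_of_age(acct: str, minimum_years: int, today: date) -> bool:
    b = _records[acct]["birthdate"]
    # completed years, adjusting down if this year's birthday hasn't passed
    years = today.year - b.year - ((today.month, today.day) < (b.month, b.day))
    return years >= minimum_years

# the merchant's entire view of the answer:
print(is_of_age("acct-001", 21, date(2005, 7, 2)))   # → True
```

Since the birthdate stays inside the account authority, nothing usable for identity fraud crosses the wire.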

The FSTC/FAST scenario was that there is a large and thriving internet business for age verification .... but it involved a segment of the internet business that many consider unsavory. However, the widespread deployed implementation was based on an intermediary doing a
"$1 auth"

credit card transaction as part of registration. The
"$1 auth"

would never be settled, so there was never any actual credit card charge (although your
credit limit

or
open to buy

would be decremented by a dollar for a couple days until the auth had expired). The theory was that a person had to be of legal age to sign a credit card contract, which in turn enabled them to do credit card transactions. There was a lot of money being made off of this
"$1 auth"

hack ... and only a very small amount going to the financial industry. Note however, FAST was never intended to only be limited to age verification ... but age verification was viewed as an already well-established market.

When we were called in to work on the cal. state and federal electronic signature legislation, one of the industry groups had done studies on the driving factors behind privacy regulation and legislation. The two main driving factors were 1) identity theft and 2) (institutional) denial of service; aka the primary driving factors weren't privacy itself ... it was the prospect of fraud and/or being denied a job or various types of services.

so as implied in the previous post
https://www.garlic.com/~lynn/2005l.html#35
and the post on security proportional to risk
https://www.garlic.com/~lynn/2001h.html#63

many relationship management infrastructures have strongly conflicting business confidentiality objectives ... readily available for use by lots of business processes and, at the same time, hardly available at all because there is authentication information that can also be used to impersonate and originate fraudulent transactions.

harvesting of such repositories is frequently made easier because of the large number of different business processes that require access to the information (in some cases the transaction information even defines the authentication information).

going back to the security PAIN acronym
P ... privacy (or sometimes CAIN, confidential)
A ... authentication
I ... integrity
N ... non-repudiation


the businesses tend to have a strong integrity requirement for their relationship management systems (various kinds of integrity issues like introduction of incorrect values can affect their bottom line). However, they tend to have a much lower privacy requirement for their relationship management systems (in part because a large number of different business processes require access).

When an insider swipes the information, they tend to go far away to do their account/identity fraud.

I'm also a co-author of the x9.99 financial industry privacy impact assessment (PIA) standard. Most companies understand using security (and frequently integrity) to protect themselves. However, it frequently takes a change in mindset to start using security (and frequently privacy) in the protection of others. minor note ... as part of x9.99, i also started a privacy taxonomy and glossary (trying to help organize how you think about privacy):
https://www.garlic.com/~lynn/index.html#glosnote

One of the issues with the posting on security proportional to risk ... is that even if you blanketed the earth under miles of cryptography, the current infrastructure still can leak information that can be used in account and identity fraud.

One of the things in the x9.59 standard
https://www.garlic.com/~lynn/x959.html#x959

was that it removed knowledge of an account number as a point of compromise. Given that the account number is used in an enormous number of business processes ... trying to keep it confidential appears to be an impossible task. so x9.59 changed the rules: it made the account number useless to crooks for performing fraudulent transactions:

1) x9.59 transactions had to be strongly authenticated
2) account numbers used in x9.59 transactions could not be used in non-authenticated transactions

aka gave up on trying to keep the account number confidential ... just made knowledge of the account number useless to crooks for account/identity fraud.
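The two x9.59-style rules above can be sketched in a few lines. This is a toy illustration, not the actual standard: the field names and registry are invented, and an HMAC stands in for the real public-key digital signature.

```python
import hmac
import hashlib

# toy registry: account number -> registered verification key
# (a real x9.59 deployment registers a public key with the consumer's bank)
REGISTERED_KEYS = {"acct-1234": b"consumer-key"}
AUTH_ONLY = {"acct-1234"}   # rule 2: these accounts refuse unauthenticated use

def sign(account, amount, key):
    """Stand-in for the consumer's digital signature over the transaction."""
    msg = f"{account}:{amount}".encode()
    return hmac.new(key, msg, hashlib.sha256).hexdigest()

def authorize(account, amount, signature):
    """Honor a transaction only per the two x9.59-style rules."""
    if account in AUTH_ONLY:
        key = REGISTERED_KEYS.get(account)
        if signature is None or key is None:
            return False                      # rule 2: no unauthenticated use
        expected = sign(account, amount, key)
        return hmac.compare_digest(expected, signature)   # rule 1
    return True   # legacy accounts fall through to existing (weaker) rules
```

The point of the design shows up directly: a harvested account number alone is worthless, since without the registered key no valid signature can be produced.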

misc. related postings
https://www.garlic.com/~lynn/aadsm6.htm#terror7 [FYI] Did Encryption Empower These Terrorists?
https://www.garlic.com/~lynn/aadsm6.htm#terror13 [FYI] Did Encryption Empower These Terrorists?
https://www.garlic.com/~lynn/aadsm8.htm#3dvulner 3D Secure Vulnerabilities?
https://www.garlic.com/~lynn/aadsm8.htm#softpki16 DNSSEC (RE: Software for PKI)
https://www.garlic.com/~lynn/aepay11.htm#66 Confusing Authentication and Identiification?
https://www.garlic.com/~lynn/aadsm14.htm#4 Who's afraid of Mallory Wolf?
https://www.garlic.com/~lynn/aadsm15.htm#27 SSL, client certs, and MITM (was WYTM?)
https://www.garlic.com/~lynn/aadsm16.htm#20 Ousourced Trust (was Re: Difference between TCPA-Hardware and a smart card and something else before
https://www.garlic.com/~lynn/aadsm17.htm#41 Yahoo releases internet standard draft for using DNS as public key server
https://www.garlic.com/~lynn/aadsm18.htm#29 EMV cards as identity cards
https://www.garlic.com/~lynn/aadsm19.htm#39 massive data theft at MasterCard processor
https://www.garlic.com/~lynn/aadsm19.htm#40 massive data theft at MasterCard processor
https://www.garlic.com/~lynn/2000g.html#41 Egghead cracked, MS IIS again
https://www.garlic.com/~lynn/2001f.html#24 Question about credit card number
https://www.garlic.com/~lynn/2002j.html#14 Symmetric-Key Credit Card Protocol on Web Site
https://www.garlic.com/~lynn/2002n.html#14 So how does it work... (public/private key)
https://www.garlic.com/~lynn/2003k.html#66 Digital signature and Digital Certificate
https://www.garlic.com/~lynn/2004b.html#25 Who is the most likely to use PK?
https://www.garlic.com/~lynn/2004i.html#5 New Method for Authenticated Public Key Exchange without Digital Certificates
https://www.garlic.com/~lynn/2004m.html#9 REVIEW: "Biometrics for Network Security", Paul Reid
https://www.garlic.com/~lynn/2005k.html#26 More on garbage
https://www.garlic.com/~lynn/2005l.html#22 The Worth of Verisign's Brand

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

More Phishing scams, still no SSL being used

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: More Phishing scams, still no SSL being used...
Newsgroups: netscape.public.mozilla.crypto
Date: 02 Jul 2005 14:54:36 -0600
Ram A Moskovitz writes:
I tend to agree that the operator of a particular IA or CA is in the best position to offer real time revocation services as they have the primary data feed and would not have to rely on stale information to provide status services. The IETF flavor of OCSP can be profiled to retain the highly desireable cacheing features of the pre OCSP status protocol such that one can tune the length of cache validity to the applications so that in risk transactions (not necessarily just financial I should point out) one can use real-time status while for lower risk transactions one can use a longer cache period and in that way maintain control of the balance between risk and cost - this is a very useful capability when trying to manage risk effeciently and effectively.

so in the FSTC FAST scenario
http://www.fstc.org/

mentioned in the previous post
https://www.garlic.com/~lynn/2005l.html#36 More Phishing scams, still no SSL being used

instead of the end-user sending the merchant a digitally signed x9.59 transaction mapped into standard iso 8583 message network
https://www.garlic.com/~lynn/8583flow.htm

which the relying party then sends off, getting an answer back from the authoritative agency (in the case of a financial transaction, whether the merchant will be paid or not) ... a very similarly formatted transaction of the same size and shape is sent off to ask any of possibly dozens of questions.

furthermore, there is NO attached redundant and superfluous digital certificate that results in two orders of magnitude payload bloat.

another way of looking at it ... is rather than having a large PKI infrastructure targeted at efficiently providing information in a no-value and/or offline environment ... and then layering the overhead of an online transaction infrastructure over it .... there is just the overhead of the online transaction infrastructure.

So the FAST scenario has at least all the transaction efficiencies of OCSP ... w/o any of the heavy duty, extraneous, redundant and superfluous burden of PKIs and digital certificates.

The other way of looking at it ... was that OCSP was trying to emulate the online transaction efficiencies of FAST, while trying to maintain the facade that the stale, static, redundant and superfluous PKI digital certificates were in any way useful (for such an online environment).

To meet that requirement (maintaining the fiction that digital certificates were useful in such environments), OCSP was limiting itself to a transaction about whether the information in a specific stale, static, redundant and superfluous digital certificate was still valid. The FAST scenario just did a highly efficient, straight-through processing, digitally signed transaction to get a reply about the actual information (somewhat riding the existing rails that provide highly efficient straight-through processing for payment transactions). If OCSP started expanding its horizon and asking real live questions (aka turned into something more akin to FAST) ... then it would become more readily apparent that the stale, static, redundant and superfluous digital certificates weren't serving any useful purpose.

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

IBM/Watson autobiography--thoughts on?

Refed: **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: IBM/Watson autobiography--thoughts on?
Newsgroups: alt.folklore.computers
Date: 03 Jul 2005 20:45:08 -0600
fairwater@gmail.com (Derek Lyons) writes:
ROTFLMAO. You really do live in a fantasy world. Farmers didn't know how to make the glass jars that puttin' by depended on. (Just one of hundreds of items they didn't make, didn't know how to make, didn't barter for - and required cash flow to obtain from sources off the farm.)

farmer's co-ops throughout the midwest did a lot. typical small farm town in the 50s ... couple hundred people, 2-3 churches, co-op gas station, co-op granary, co-op grocery store, conoco gas station, rexall drug store, maybe an IGA grocery store, school, post office and a dozen bars. during the winter ... some of the school children from outlying farms might board in town.

travelling combines started the season early in texas and moved north as the wheat ripened. (at least back then) they would harvest wheat for a cut of 2-3 bushels per acre, hauling it into the farmer's co-op. the farmer's co-op carried farmers on credit all year ... and if there was enuf wheat left over after the combine crew got its cut ... they could settle the bills. this is mid-west dry land wheat, where a really good year might yield 6 bushels/acre (before the combine crew's cut) ... not the famous stuff from eastern washington where they've posted records of 90-100 bushels/acre.

doing a quick web search ... there is a short mention of the traveling combines that started in texas and moved north to montana ... see the bottom of this page ... harvest brigade
http://www.hutchnews.com/past/06-27-2004/adastra/adastra2.html

slightly more detailed description of the harvest route
http://www.skinnerharvesting.com/
http://www.ckfigginsharvesting.com/

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

Safe to transmit (symmetric) key encrypted with itself?

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: Safe to transmit (symmetric) key encrypted with itself?
Newsgroups: sci.crypt
Date: 04 Jul 2005 08:38:51 -0600
or maybe it is still somewhat at this stage:
http://science.slashdot.org/article.pl?sid=05/07/03/0431212&tid=14

furthermore i believe there have been one or two discovery programs about disciplines mandating what the correct orthodoxy is (which later turned out to be wrong). it is possible you are comparing two disciplines that are at completely different maturity levels.

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

IBM/Watson autobiography--thoughts on?

From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: IBM/Watson autobiography--thoughts on?
Newsgroups: alt.folklore.computers
Date: 04 Jul 2005 09:16:59 -0600
jmfbahciv writes:
The area I grew up in wasn't that remote. As for those glass canning jars: My mother, who is over 80, is still using jars that her mother used. When my mother dies, her daughters will use them. I'm using my grandmother's cast aluminum Dutch oven. She received it as a wedding present probably around 1916 or so.

are any of the glass jars the kind with the glass tops and the wire bale that slips over the lid to hold it down?

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

25% Pageds utilization on 3390-09?

Refed: **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: 25% Pageds utilization on 3390-09?
Newsgroups: bit.listserv.ibm-main
Date: 05 Jul 2005 09:12:58 -0600
martin_packer@ibm-main.lst (Martin Packer) writes:
Note: Both the "25%" and the "1.5x" ROTs are trying to simplify probabilistic things (as always). So, you know in your shop what happens when you fail the ROT. In the former case paging performance tanks. In the latter it's a real bad news day when that 6GB DB2 subsystem dumps. (I've seen the latter happen and it's not pretty - when you don't have the paging space to contain it.)

the idea behind big pages is similar to work on "log structured" file systems ... collect enuf stuff together to do a minimum number of large writes, and for those writes move the arm as little as possible. big pages were originally ten 4k pages that fit on a single 3380 track. on page-out, ten pages from the virtual address space were collected together and written to a single track ... the closest available empty track in the direction of the arm motion (basically a moving cursor algorithm that swept across the disk surface in a consistent direction). the theory was that the area just behind the cursor would be full and the area just ahead of the cursor would be nearly empty (requiring minimum arm motion to perform the write).
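The moving-cursor write can be sketched as follows. This is a minimal model under stated assumptions: tracks are a flat numbered ring and "empty" is just a set of track numbers; real channel programs and cylinder geometry are omitted.

```python
def pick_track(cursor, empty, ntracks):
    """Return the nearest empty track at or ahead of the cursor,
    sweeping in a consistent direction (wrapping at the end)."""
    for step in range(ntracks):
        t = (cursor + step) % ntracks
        if t in empty:
            return t
    raise RuntimeError("paging area full")

def write_big_page(state, pages):
    """Write one big page (ten 4k pages) to the closest empty track
    ahead of the cursor, then advance the cursor past it."""
    t = pick_track(state["cursor"], state["empty"], state["ntracks"])
    state["empty"].discard(t)
    state["disk"][t] = pages                      # whole track in one access
    state["cursor"] = (t + 1) % state["ntracks"]  # keep sweeping forward
    return t
```

The key property is that consecutive writes land just ahead of each other, so the arm rarely has far to travel; the area behind the cursor fills up while the area ahead stays mostly empty.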

the difference between paging and a log structured file system ... is that page data tended to be ephemeral ... i.e. the data on disk was discarded when a big page was fetched back to memory (on page fault). a log structured file system involved persistent data and would periodically compact scattered regions on disk.

the issue was that over a 10-15 year period, relative system disk access performance had declined by a factor of ten (that is, memory and cpu got 50 times bigger & faster, while disk access thruput only got 3-5 times faster). The issue in the 3081/3380 time-frame was that while 3380 disk access performance had only increased by a factor of maybe four ... 3380 disk transfer speed had increased by a factor of ten (and real memory had increased by a factor of maybe 50). The result was a relative over-abundance of disk transfer capacity and real memory compared to disk access thruput (and total disk space capacity had also enormously increased).

The issue was how to trade-off the enormous amount of disk space capacity, and the relatively large amounts of disk transfer capacity and real storage against the scarce bottleneck disk arm access resource.

big-pages with moving cursor ... did ten 4k page writes for every arm access and attempted to drastically minimize the expected arm travel distance (compared to single page at a time transfer). with very sparse allocation (trading disk space resources against disk arm access scarcity), multiple big page writes might be performed on the same cylinder w/o arm motion.

on any 4k page-fault ... all ten 4k pages of a big page were brought back into memory. compared to a 4k page at a time page fault strategy, it might bring in 2-3 more pages than the application would eventually need. however, it would likely avoid at least 5-6 page transfers compared to a 4k page at a time strategy (since the pages had already been brought in). The trade-off was that on fetch, a big page might unnecessarily transfer 2-3 pages (disk transfer resource) and occupy 2-3 pages of additional real memory (real memory resource) at the savings of 5-6 arm accesses. hopefully the number of such additional page transfers was minimized ... but even if it wasn't, reducing ten arm accesses to one more than offset any 8-12k increase in the amount of data transferred and/or 8-12k increase in real storage needed.
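The accounting in the paragraph above works out as simple arithmetic. The 7-of-10 figure below is an illustrative assumption consistent with the rough 2-3 extra / 5-6 saved estimates in the text, not measured data.

```python
# illustrative assumption: on a big-page fault, the application will
# eventually touch 7 of the 10 pages brought in with the big page
pages_per_big_page = 10
pages_eventually_needed = 7

# single-page strategy: one arm access (one fault) per needed page
single_page_arm_accesses = pages_eventually_needed

# big-page strategy: one arm access fetches the whole track
big_page_arm_accesses = 1

extra_pages_transferred = pages_per_big_page - pages_eventually_needed
arm_accesses_saved = single_page_arm_accesses - big_page_arm_accesses

print(arm_accesses_saved, "arm accesses saved for",
      extra_pages_transferred, "extra 4k transfers")
```

With these numbers, trading 3 extra 4k transfers (12k of transfer and real storage) for 6 saved arm accesses is a clear win when the arm, not transfer capacity or memory, is the bottleneck.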

another trade-off was that most single-page algorithms tended to preserve home position when a page was read into memory. while the page was in memory ... a copy existed both on disk and in real storage. when the page was selected for replacement, if it hadn't been changed during its most recent stay ... the page write could be avoided (just keeping the existing copy on disk). in the big page scenario ... the notion of an existing copy on disk was discarded (in part because its arm position might not bear any relationship to the arm position when the page was to be removed from storage). As a result, the number of bytes written out for page replacements somewhat increased (compared to single page at a time with home position) at a savings in the number of disk arm accesses.
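The home-position optimization amounts to a dirty-bit check at replacement time. A minimal sketch, with an invented frame layout for illustration:

```python
def replace_page(frame, disk):
    """Single-page 'home position' policy: on replacement, write the page
    back to its home slot only if it was changed during this stay."""
    if frame["dirty"]:
        disk[frame["slot"]] = frame["data"]   # must write back
        return True                           # one page write consumed
    return False    # clean: the copy already at home position is still valid
```

The big-page scheme gives this up: since pages are rewritten wherever the cursor happens to be, there is no home copy to fall back on, and every replacement of a big page costs a write.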

at the introduction of the 3380 there was some discussion about the enormous increase in 3380 disk space capacity. if you completely filled a 3380, at 4k bytes transferred per access ... the available accesses per second per byte was lower than for a 3350; i.e. 3380 arm accesses per second were higher than 3350 arm accesses per second ... but the increase in 3380 disk space capacity was even larger.
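The accesses-per-second-per-byte point can be made concrete with round numbers. These figures are illustrative only, not the actual 3350/3380 device specifications.

```python
# illustrative round numbers only -- NOT actual device specifications
drives = {
    "smaller-drive": {"mbytes": 300, "accesses_per_sec": 30},
    "bigger-drive":  {"mbytes": 630, "accesses_per_sec": 50},
}

for name, d in drives.items():
    per_mb = d["accesses_per_sec"] / d["mbytes"]
    print(f"{name}: {per_mb:.3f} accesses/sec per MB stored")

# the bigger drive does more arm accesses per second, but puts even more
# bytes behind each arm -- so, completely filled, it offers fewer
# accesses per second per byte than the smaller drive
```

This is why the less-than-80-percent-full recommendation discussed below makes sense: leaving capacity unused restores the accesses-per-second-per-byte ratio.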

There was a discussion at SCIDS regarding recommendations to datacenter management that 3380s only be filled to less than 80 percent capacity to maintain thruput equivalence (between a full 3350 and a full 3380). The problem was that datacenter management tended to account for disk space cylinders but not necessarily overall system thruput (based on the bottleneck of available disk accesses per second). The recommendation was that a "fast" 3380 should be announced ... one with a controller microcode load that reduced the number of available 3380 cylinders ... and that the price of the "fast" 3380 should be higher than the price of a regular 3380 (with no reduction in cylinders). The SCIDS discussion was that this was possibly the only way to convince most datacenter management of the benefits of managing the disk-accesses-per-second-per-byte thruput bottleneck (make them pay more for a hardware-enforced feature that they could otherwise achieve thru simple administrative policy).

when I first started making statements and writing about the drastic decline in relative system thruput of disk arm accesses, GPD management assigned their performance modeling group to refute the claims. after some period, they came back with the conclusion that I had slightly understated the reduction in disk relative system thruput. this was then turned around and made into a SHARE presentation on how to optimize for the (unique?) 3380 performance characteristics.

misc. past posts about observing that relative system disk access thruput had declined by at least ten times over a period of years.
https://www.garlic.com/~lynn/93.html#31 Big I/O or Kicking the Mainframe out the Door
https://www.garlic.com/~lynn/94.html#43 Bloat, elegance, simplicity and other irrelevant concepts
https://www.garlic.com/~lynn/94.html#55 How Do the Old Mainframes Compare to Today's Micros?
https://www.garlic.com/~lynn/95.html#10 Virtual Memory (A return to the past?)
https://www.garlic.com/~lynn/98.html#46 The god old days(???)
https://www.garlic.com/~lynn/99.html#4 IBM S/360
https://www.garlic.com/~lynn/99.html#112 OS/360 names and error codes (was: Humorous and/or Interesting Opcodes)
https://www.garlic.com/~lynn/2001d.html#66 Pentium 4 Prefetch engine?
https://www.garlic.com/~lynn/2001f.html#62 any 70's era supercomputers that ran as slow as today's supercomputers?
https://www.garlic.com/~lynn/2001f.html#68 Q: Merced a flop or not?
https://www.garlic.com/~lynn/2001l.html#40 MVS History (all parts)
https://www.garlic.com/~lynn/2001l.html#61 MVS History (all parts)
https://www.garlic.com/~lynn/2001m.html#23 Smallest Storage Capacity Hard Disk?
https://www.garlic.com/~lynn/2002b.html#11 Microcode? (& index searching)
https://www.garlic.com/~lynn/2002b.html#20 index searching
https://www.garlic.com/~lynn/2002e.html#8 What are some impressive page rates?
https://www.garlic.com/~lynn/2002e.html#9 What are some impressive page rates?
https://www.garlic.com/~lynn/2002.html#5 index searching
https://www.garlic.com/~lynn/2002i.html#16 AS/400 and MVS - clarification please
https://www.garlic.com/~lynn/2003i.html#33 Fix the shuttle or fly it unmanned
https://www.garlic.com/~lynn/2004n.html#22 Shipwrecks
https://www.garlic.com/~lynn/2004p.html#39 100% CPU is not always bad
https://www.garlic.com/~lynn/2005h.html#13 Today's mainframe--anything to new?
https://www.garlic.com/~lynn/2005k.html#53 Performance and Capacity Planning

misc. past big page posts
https://www.garlic.com/~lynn/2001k.html#60 Defrag in linux? - Newbie question
https://www.garlic.com/~lynn/2002b.html#20 index searching
https://www.garlic.com/~lynn/2002c.html#29 Page size (was: VAX, M68K complex instructions)
https://www.garlic.com/~lynn/2002c.html#48 Swapper was Re: History of Login Names
https://www.garlic.com/~lynn/2002e.html#8 What are some impressive page rates?
https://www.garlic.com/~lynn/2002e.html#11 What are some impressive page rates?
https://www.garlic.com/~lynn/2002f.html#20 Blade architectures
https://www.garlic.com/~lynn/2002l.html#36 Do any architectures use instruction count instead of timer
https://www.garlic.com/~lynn/2002m.html#4 Handling variable page sizes?
https://www.garlic.com/~lynn/2003b.html#69 Disk drives as commodities. Was Re: Yamhill
https://www.garlic.com/~lynn/2003d.html#21 PDP10 and RISC
https://www.garlic.com/~lynn/2003f.html#5 Alpha performance, why?
https://www.garlic.com/~lynn/2003f.html#9 Alpha performance, why?
https://www.garlic.com/~lynn/2003f.html#16 Alpha performance, why?
https://www.garlic.com/~lynn/2003f.html#48 Alpha performance, why?
https://www.garlic.com/~lynn/2003g.html#12 Page Table - per OS/Process
https://www.garlic.com/~lynn/2003o.html#61 1teraflops cell processor possible?
https://www.garlic.com/~lynn/2003o.html#62 1teraflops cell processor possible?
https://www.garlic.com/~lynn/2004e.html#16 Paging query - progress
https://www.garlic.com/~lynn/2004.html#13 Holee shit! 30 years ago!
https://www.garlic.com/~lynn/2004n.html#22 Shipwrecks
https://www.garlic.com/~lynn/2004p.html#39 100% CPU is not always bad
https://www.garlic.com/~lynn/2005h.html#15 Exceptions at basic block boundaries
https://www.garlic.com/~lynn/2005j.html#51 Q ALLOC PAGE vs. CP Q ALLOC vs ESAMAP

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/

More Phishing scams, still no SSL being used

Refed: **, - **, - **, - **, - **
From: Anne & Lynn Wheeler <lynn@garlic.com>
Subject: Re: More Phishing scams, still no SSL being used...
Newsgroups: netscape.public.mozilla.crypto
Date: 05 Jul 2005 10:26:11 -0600
pgut001@cs.auckland.ac.nz (Peter Gutmann) writes:
I actually suggested updating OCSP to provide true live status information (akin to the accepted/declined response in CC transactions) a few years ago, but this was violently rejected by PKIX members because that's Not How X.509 Works. One list member in particular became almost hysterical over the suggestion, going so far as to privately petition the WG chair to have the proposal killed (it's public record on the PKIX mailing list, although some of the more extreme hysterics occurred in private mail).

which then is essentially the FSTC/FAST
http://www.fstc.org/

from the mid-90s .... but it makes having the digital certificate redundant and superfluous (aka you don't need a digital certificate to do a real live transaction) ... which i have repeatedly commented to the PKIX and OCSP factions (and may have been part of the reason for their violent reaction to your suggestion).

in '98 i was on a "PKI" panel at nissc conference with four other people ... three representing the major CAs and one other person.

the people representing the three major CAs (typically the CTO) talked about how everybody had heard how hard PKIs were ... and they were here to tell you that it is much, much simpler than you have heard.

I then talked about the majority of the business processes in the world can be upgraded to digital signature authentication w/o requiring digital certificates.

the fifth person then talked about being responsible for the largest and longest-deployed PKI operation ... people may have heard how hard PKIs were, and they were here to tell you that PKIs are actually much, much harder than anything you have heard.

--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/


previous, next, index - home