From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: Book on computer architecture for beginners Newsgroups: comp.arch,alt.folklore.computers Date: 22 Jun 2005 11:45:27 -0600
keith writes:
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: The Worth of Verisign's Brand Newsgroups: netscape.public.mozilla.crypto Date: 22 Jun 2005 14:09:00 -0600
"Anders Rundgren" writes:
https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3
in the year we worked with them ... they moved and changed their name. trivia question ... who had owned the rights to their new name?
as part of the effort on doing the thing called a payment gateway and allowing servers to do payment transactions ... we also had to perform business due diligence on most of the operations that produced these things called SSL domain name certificates. At the time we coined the term certificate manufacturing ... to differentiate what most of them were doing from this thing called PKI (aka they didn't actually have any operational business process for doing much more than pushing the certificates out the door).
It was also when we coined the term merchant comfort certificates (since it made the relying parties ... aka the consumers ... feel better).
we also originated the comparison between PKI CRLs and the paper booklet invalidation model used by the payment card industry in the 60s. when some number of people would comment that it was time to move payment card transactions into the modern world using digital certificates ... we would point out to them ... rather than modernizing the activity ... it was regressing the operations by 20-30 years.
Another analogy for certificates is the offline payment card world of the 50s & 60s ... which had to mail out invalid account booklets on a monthly basis ... and then as the number of merchants, card holders and risks increased ... they started going to humongous weekly mailings. At least they had a record of all the relying parties (aka merchants) ... while the typical PKI operation has no idea whatsoever who the relying parties are.
It was sometime after we started pointing out that PKIs really had a business model oriented at an offline business environment ... which would result in regressing many business operations by decades if it was force fit on them ... that you saw OCSP come on the scene.
OCSP doesn't actually provide any information ... it is just there to indicate whether whatever information might be in a certificate is still valid. The CRL model is known to not scale ... as the payment card industry found out going into the 70s.
PKIs imply the administration and management of the trust information. In the offline certificate model ... either they have to have a list of all possible relying parties and regularly push invalidation lists out to them ... or they provide an online service which allows relying parties to check whether a certificate is still valid. However, as you point out, PKI administration and management of any kind doesn't really scale ... which resulted in actual deployments being simple certificate manufacturing (instead of real PKI).
The payment card industry also demonstrated the lack of scaling of the certificate-like offline model in the 70s when they converted to an online model for the actual information.
Part of the problem with the online OCSP model is that it has all the overhead of an online model with all the downside of the offline implementation ... aka not providing the relying party access to real online, timely, actual information ... things like timely aggregation information (current account balance) or timely sequences of events (say for fraud detection).
part of the viability for the no/low-value market segment is to stick with simple certificate manufacturing and not actually try to manage and administrate the associated information trust.
random past certificate manufacturing posts
https://www.garlic.com/~lynn/aepay2.htm#fed Federal CP model and financial transactions
https://www.garlic.com/~lynn/aepay2.htm#cadis disaster recovery cross-posting
https://www.garlic.com/~lynn/aepay3.htm#votec (my) long winded observations regarding X9.59 & XML, encryption and certificates
https://www.garlic.com/~lynn/aadsm2.htm#scale Scale (and the SRV record)
https://www.garlic.com/~lynn/aadsm2.htm#inetpki A PKI for the Internet (was RE: Scale (and the SRV
https://www.garlic.com/~lynn/aadsm3.htm#kiss7 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm5.htm#pkimort2 problem with the death of X.509 PKI
https://www.garlic.com/~lynn/aadsm5.htm#faith faith-based security and kinds of trust
https://www.garlic.com/~lynn/aadsm8.htm#softpki6 Software for PKI
https://www.garlic.com/~lynn/aadsm8.htm#softpki10 Software for PKI
https://www.garlic.com/~lynn/aadsm8.htm#softpki14 DNSSEC (RE: Software for PKI)
https://www.garlic.com/~lynn/aadsm8.htm#softpki20 DNSSEC (RE: Software for PKI)
https://www.garlic.com/~lynn/aadsm9.htm#cfppki5 CFP: PKI research workshop
https://www.garlic.com/~lynn/aadsmore.htm#client4 Client-side revocation checking capability
https://www.garlic.com/~lynn/aepay10.htm#81 SSL certs & baby steps
https://www.garlic.com/~lynn/aepay10.htm#82 SSL certs & baby steps (addenda)
https://www.garlic.com/~lynn/aadsm11.htm#34 ALARMED ... Only Mostly Dead ... RIP PKI
https://www.garlic.com/~lynn/aadsm11.htm#39 ALARMED ... Only Mostly Dead ... RIP PKI .. addenda
https://www.garlic.com/~lynn/aadsm13.htm#35 How effective is open source crypto? (bad form)
https://www.garlic.com/~lynn/aadsm13.htm#37 How effective is open source crypto?
https://www.garlic.com/~lynn/aadsm14.htm#19 Payments as an answer to spam (addenda)
https://www.garlic.com/~lynn/aadsm14.htm#37 Keyservers and Spam
https://www.garlic.com/~lynn/aadsm15.htm#0 invoicing with PKI
https://www.garlic.com/~lynn/aadsm19.htm#13 What happened with the session fixation bug?
https://www.garlic.com/~lynn/98.html#0 Account Authority Digital Signature model
https://www.garlic.com/~lynn/2000.html#40 "Trusted" CA - Oxymoron?
https://www.garlic.com/~lynn/2001d.html#7 Invalid certificate on 'security' site.
https://www.garlic.com/~lynn/2001d.html#16 Verisign and Microsoft - oops
https://www.garlic.com/~lynn/2001d.html#20 What is PKI?
https://www.garlic.com/~lynn/2001g.html#2 Root certificates
https://www.garlic.com/~lynn/2001g.html#68 PKI/Digital signature doesn't work
https://www.garlic.com/~lynn/2001h.html#0 PKI/Digital signature doesn't work
https://www.garlic.com/~lynn/2001j.html#8 PKI (Public Key Infrastructure)
https://www.garlic.com/~lynn/2003.html#41 InfiniBand Group Sharply, Evenly Divided
https://www.garlic.com/~lynn/2003l.html#36 Proposal for a new PKI model (At least I hope it's new)
https://www.garlic.com/~lynn/2003l.html#45 Proposal for a new PKI model (At least I hope it's new)
https://www.garlic.com/~lynn/2003l.html#46 Proposal for a new PKI model (At least I hope it's new)
https://www.garlic.com/~lynn/2004m.html#12 How can I act as a Certificate Authority (CA) with openssl ??
random past comfort certificate postings:
https://www.garlic.com/~lynn/aadsm2.htm#mcomfort Human Nature
https://www.garlic.com/~lynn/aadsm2.htm#mcomf3 Human Nature
https://www.garlic.com/~lynn/aadsm2.htm#useire2 U.S. & Ireland use digital signature
https://www.garlic.com/~lynn/aadsm3.htm#kiss5 Common misconceptions, was Re: KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss7 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsmail.htm#comfort AADS & X9.59 performance and algorithm key sizes
https://www.garlic.com/~lynn/aepay4.htm#comcert Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert2 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert3 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert4 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert5 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert6 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert7 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert8 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert9 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert10 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert11 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert12 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert13 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert14 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert15 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert16 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay4.htm#comcert17 Merchant Comfort Certificates
https://www.garlic.com/~lynn/aepay6.htm#dspki use of digital signatures and PKI
https://www.garlic.com/~lynn/aepay10.htm#80 Invisible Ink, E-signatures slow to broadly catch on (addenda)
https://www.garlic.com/~lynn/2000c.html#32 Request for review of "secure" storage scheme
https://www.garlic.com/~lynn/2001c.html#62 SSL weaknesses
https://www.garlic.com/~lynn/2003l.html#43 Proposal for a new PKI model (At least I hope it's new)
https://www.garlic.com/~lynn/2004b.html#39 SSL certificates
https://www.garlic.com/~lynn/2004c.html#43 why and how VeriSign, thawte became a trusted CA?
https://www.garlic.com/~lynn/2004i.html#4 New Method for Authenticated Public Key Exchange without Digital Certificates
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: The Worth of Verisign's Brand Newsgroups: netscape.public.mozilla.crypto Date: 22 Jun 2005 14:57:15 -0600
Anne & Lynn Wheeler writes:
misc. refs:
https://www.garlic.com/~lynn/2005i.html#42 Development as Configuration
https://www.garlic.com/~lynn/2005i.html#43 Development as Configuration
https://www.garlic.com/~lynn/2005i.html#44 SqlServerCE and SOA - an architecture question
https://www.garlic.com/~lynn/2005i.html#48 defeating firewalls made easy
https://www.garlic.com/~lynn/2005k.html#2 Ancient history
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: The Worth of Verisign's Brand Newsgroups: netscape.public.mozilla.crypto Date: 22 Jun 2005 15:13:41 -0600
"Anders Rundgren" writes:
the problem with OCSP services is that it supposedly just says yes/no as to whether the stale, static certificate information is still applicable or not.
as mentioned ... this has all the overhead of having an online service w/o any of the benefits.
the payment infrastructure moved out of this offline (certificate-like), archaic design in the 70s with online authentication and authorization with timely online access to the actual, real information ... like aggregated information of sequences of operations. This resulted in things like support for fraud detection patterns and current account balance. the current account balance represents the starting value (which you might or might not consider including in a stale, static, redundant and superfluous certificate?), in addition to the aggregation of all the ongoing operations updating the current account balance with subtractions and additions (say issue a brand new stale, static, redundant and superfluous certificate every time there is an account balance update, and then spray it all over the world to every possible and/or potential relying party).
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: SHARE 50 years Newsgroups: alt.folklore.computers Date: 23 Jun 2005 06:06:01 -0600
Fifty years of sharing open source
http://business.newsforge.com/business/05/06/15/166253.shtml?tid=35&tid=18
and some comments
http://business.newsforge.com/comments.pl?sid=47512&mode=flat&commentsort=0&op=Change
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: Book on computer architecture for beginners Newsgroups: comp.arch,alt.folklore.computers Date: 23 Jun 2005 06:15:02 -0600
Alex McDonald writes:
found this listing for nylon ribbons for 1403
http://www.cleansweepsupply.com/pages/item-lex0457937.html
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: SHARE 50 years? Newsgroups: bit.listserv.ibm-main Date: 23 Jun 2005 07:21:19 -0600
Fifty years of sharing open source
http://business.newsforge.com/business/05/06/15/166253.shtml?tid=35&tid=18
and some comments
http://business.newsforge.com/comments.pl?sid=47512&mode=flat&commentsort=0&op=Change
ref to presentation made at fall 68 share meeting
https://www.garlic.com/~lynn/94.html#18
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: lynn@garlic.com Newsgroups: microsoft.public.dotnet.framework.aspnet.security Subject: Re: Signing and bundling data using certificates Date: Thu, 23 Jun 2005 10:24:14 -0700
Alan Fisher wrote:
there is a business process called public/private key ... where one key is made public (public key) and the other key is kept confidential and is never divulged (private key).
there is an additional business process called digital signature authentication ... where a hash of some data is made and then encoded
with private key. the corresponding public key can be used to decode
the digital signature ... and then compare the decoded digital
signature with a recomputed hash of the message. If the recomputed
hash and the decoded digital signature are the same, then the
recipient knows that 1) the message hasn't been modified and 2)
authenticates the originator of the message.
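a minimal sketch of that hash/encode/decode flow, using the python cryptography package ... the key size, padding and hash choices here are just illustrative assumptions, not part of any particular standard:

# hypothetical sketch: hash-and-sign with the private key, verify with the public key
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

# originator generates a key pair; the private key is never divulged
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"the actual message or document"

# originator: hash the message and encode the hash with the private key
signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

# recipient: recompute the hash and compare it against the decoded digital signature
try:
    public_key.verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
    print("hashes match: message unmodified, originator authenticated")
except InvalidSignature:
    print("digital signature did not verify")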
in standard business practice ... somebody registers their public key
with destinations and/or relying parties ... in much the same way they
might register a pin, password, SSN#, mother's maiden name, birth
date, and/or any other authentication information. The advantage of
registering a public key over some sort of static, shared-secret
... is that a public key can only be used to authenticate digital
signatures .... it can't be used for impersonation (as part of
generating a digital signature).
https://www.garlic.com/~lynn/subpubkey.html#certless
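a rough sketch of the certificate-less, on-file public key registration described above ... the account numbers and helper names here are made up purely for illustration:

# hypothetical sketch: relying party keeps registered public keys on file
# (analogous to other on-file authentication information), no certificate involved
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

registry = {}  # relying party's on-file info: account -> registered public key

def register(account, public_key):
    # done once, as part of establishing the business relationship
    registry[account] = public_key

def authenticate(account, message, signature):
    # the on-file public key can only verify digital signatures ... it can't be
    # used to impersonate the account holder
    try:
        registry[account].verify(signature, message)
        return True
    except InvalidSignature:
        return False

key = ed25519.Ed25519PrivateKey.generate()   # account holder keeps this private
register("acct-123", key.public_key())
txn = b"example transaction"
print(authenticate("acct-123", txn, key.sign(txn)))   # True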
On-file, static, shared-secret authentication information can not only
be used for authentication ... but also impersonation.
https://www.garlic.com/~lynn/subintegrity.html#secrets
Digital certificates are a business process that addresses an offline email scenario from the early 80s ... where the recipient dials up their local (electronic) post office, exchanges email, hangs up ... and then is possibly faced with authenticating some first time communication from a total stranger (and had no recourse to either local information and/or online information for obtaining the necessary information). It is somewhat analogous to the "letters of credit" used in the sailing ship days.
A trusted party "binds" the public key with some other information into a "digital certificate" and then digital signs the package called a digital certificate. The definition of a "trusted party" is that recipients have the public key of the "tursted party" is some local trusted public key repository (for instance browsers are shipped with a list of trusted party public keys in an internal trusted public key repository).
The originator creates a message or document of some sort, digitally signs the information and then packages up the 1) document, 2) digital signature, and 3) digital certificate (containing some binding of their public key to other information)
and transmits it.
The recipient/relying-party eventually gets the package composed of the three pieces. The recipient looks up the trusted party's public key in their trusted public key repository, and validates the digital signature on the enclosed digital certificate. If the digital certificate is valid, they then check the "bound" information in the digital certificate to see if it relates to anything at all they are interested in. If so, then they can take the sender's public key (included in the digital certificate) and validate the digital signature on the message. If that all seems to be valid ... they then make certain assumptions about the content of the actual message.
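a grossly simplified sketch of those relying-party steps ... a real deployment would use X.509 certificate encoding rather than the made-up json "certificate" used here:

# hypothetical sketch of the relying-party verification flow described above
import json
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519

# trusted party's public key is already on file with the recipient (e.g. shipped in a browser)
ca_key = ed25519.Ed25519PrivateKey.generate()
trusted_repository = {"example-CA": ca_key.public_key()}

# trusted party binds the sender's public key with other information and signs the package
sender_key = ed25519.Ed25519PrivateKey.generate()
raw = sender_key.public_key().public_bytes(
    serialization.Encoding.Raw, serialization.PublicFormat.Raw)
cert_body = json.dumps({"subject": "stranger", "issuer": "example-CA",
                        "public_key": raw.hex()}).encode()
certificate = {"body": cert_body, "signature": ca_key.sign(cert_body)}

# originator signs a document and transmits document + digital signature + certificate
document = b"first time communication from a total stranger"
package = (document, sender_key.sign(document), certificate)

# recipient: 1) validate the certificate with the trusted party's on-file public key,
# 2) extract the bound public key, 3) validate the document's digital signature
doc, doc_sig, cert = package
trusted_repository["example-CA"].verify(cert["signature"], cert["body"])
bound = json.loads(cert["body"])
sender_public = ed25519.Ed25519PublicKey.from_public_bytes(bytes.fromhex(bound["public_key"]))
sender_public.verify(doc_sig, doc)
print("certificate and document digital signatures both check out")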
In normal business operations ... where there is a prior relationship between the sender and the receiver ... the receiver will tend to already have authentication information about the sender in a local trusted (public key) repository (and not have to resort to trust redirection thru the use of trusted party public keys and digital certificates).
Another scenario is that in the early 90s, there were x.509 identity digital certificates where the trusted parties (or certification authorities ... i.e. CAs) were looking at grossly overloading the "bound" information in the digital certificates with enormous amounts of personal information. This was in part because the CAs didn't have a good idea what future relying parties might need in the way of information about individuals that they were communicating with.
You started to see some retrenchment of this in the mid-90s ... where
institutions started to realize that x.509 identity digital
certificates grossly overloaded with personal information represented
significant privacy and liability issues. Somewhat as a result there
was some retrenchment to relying-party-only certificates
https://www.garlic.com/~lynn/subpubkey.html#rpo
which contained little more information than the individual's public key and some sort of account number or other database index. The actual database contained the real information. However, it is trivial to show that such relying-party-only certificates not only violate the original purpose of digital certificates, but are also redundant and superfluous ... aka the relying party registers the individual's public key in their trusted repository along with all of the individual's other information. Since all of the individual's information (including their public key) is already in a trusted repository at the relying party, having an individual repeatedly transmit a digital certificate containing a small, stale, static subset of the same information is redundant and superfluous.
In some of the scenarios involving relying-party-only certificates from the mid-90s it was even worse than redundant and superfluous. One of the scenarios involved a specification for digitally signed payment transactions with an appended relying-party-only digital certificate. Typical payment transactions are on the order of 60-80 bytes. The typical relying-party-only digital certificates involved 4k-12k bytes. Not only were the relying-party-only stale, static digital certificates redundant and superfluous, they also would represent a factor of one hundred times payload bloat for the payment transaction network (increasing the size of payment transactions by one hundred times for redundant and superfluous stale, static information)
From: Anne & Lynn Wheeler <lynn@garlic.com> Newsgroups: sci.crypt.research Subject: Re: derive key from password Date: 24 Jun 2005 10:07:32 -0600
machiel@braindamage.nl (machiel) writes:
there is also some work on longer term derived key material ... where rather than doing a unique derived key per transaction ... there are long term derived keys. in the DUKPT case, clear-text information from the transaction is part of the process deriving the key. The longer term derived keys tend to use some sort of account number. You might find such implementations in transit systems. There is a master key for the whole infrastructure ... and each transit token then has a unique account number with an associated derived key. The transit system may store data in each token using the token-specific derived key. brute force on a token specific key ... doesn't put the whole infrastructure at risk.
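a very rough sketch of the per-token derived key idea ... real DUKPT and transit-system derivations differ in the details, and the master key value and account numbers below are made up:

# hypothetical sketch: deriving a per-token key from an infrastructure master key
# and the token's account number
import hmac, hashlib

master_key = bytes.fromhex("000102030405060708090a0b0c0d0e0f")  # made-up master key

def derive_token_key(master: bytes, account_number: str) -> bytes:
    # per-token key = HMAC(master key, account number); brute forcing one token's
    # key doesn't expose the master key or any other token's key
    return hmac.new(master, account_number.encode(), hashlib.sha256).digest()

key_a = derive_token_key(master_key, "token-0001")
key_b = derive_token_key(master_key, "token-0002")
assert key_a != key_b   # every token ends up with its own key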
some discussion of attack against RFC 2289 one-time password system
(uses iterative hashes of a passphrase)
https://www.garlic.com/~lynn/2003n.html#1 public key vs passwd authentication?
https://www.garlic.com/~lynn/2003n.html#2 public key vs passwd authentication?
https://www.garlic.com/~lynn/2003n.html#3 public key vs passwd authentication?
https://www.garlic.com/~lynn/2005i.html#50 XOR passphrase with a constant
from my rfc index
https://www.garlic.com/~lynn/rfcietff.htm
https://www.garlic.com/~lynn/rfcidx5.htm#1760
1760 I
The S/KEY One-Time Password System, Haller N., 1995/02/15 (12pp)
(.txt=31124) (Refs 1320, 1704) (Ref'ed By 1938, 2222, 2229, 2289,
2945, 4082)
in the RFC summary, clicking on the ".txt=nnnn" field retrieves the
actual RFC.
https://www.garlic.com/~lynn/rfcidx6.htm#1938
1938 -
A One-Time Password System, Haller N., Metz C., 1996/05/14 (18pp)
(.txt=44844) (Obsoleted by 2289) (Refs 1320, 1321, 1704, 1760)
(Ref'ed By 2243, 2284, 2828)
https://www.garlic.com/~lynn/rfcidx7.htm#2289
2289 S
A One-Time Password System, Haller N., Metz C., Nesser P., Straw
M., 1998/02/26 (25pp) (.txt=56495) (STD-61) (Obsoletes 1938) (Refs
1320, 1321, 1704, 1760, 1825, 1826, 1827) (Ref'ed By 2444, 2808,
3552, 3631, 3748, 3888) (ONE-PASS)
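a rough sketch of the iterated-hash idea behind the above one-time password RFCs ... the actual standards specify particular hash folding, seed handling and encodings, and the passphrase and seed below are made up:

# hypothetical sketch of an iterated-hash one-time password chain
import hashlib

def otp(passphrase: str, seed: str, count: int) -> bytes:
    # hash the seeded passphrase 'count' times
    value = (seed + passphrase).encode()
    for _ in range(count):
        value = hashlib.sha1(value).digest()
    return value

# server stores the 100th hash; the client reveals the 99th,
# which the server checks by hashing it one more time
server_stored = otp("correct horse battery", "seed99", 100)
client_reply = otp("correct horse battery", "seed99", 99)
assert hashlib.sha1(client_reply).digest() == server_stored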
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: Jack Kilby dead Newsgroups: alt.folklore.computers Date: 24 Jun 2005 12:28:18 -0600
hancock4 writes:
i've also seen what i consider more science, technology and engineering go into the manufacturing of a product ... than might go into the product itself. literature tends to have lots of stuff about various product technologies ... but not necessarily a whole lot about product manufacturing technology.
i once worked on a product in the mid-70s that was canceled (before announce) because it only showed $9B revenue over five years and was below the minimum threshold requirement of $10b over five years.
...
or have multi-megapixel CCDs. in the mid-80s, I got asked to spend time on what was then called Berkeley 10m (now called Keck 10m .. and they have built a second one). at the time, as part of the effort, they were testing 200x200 (40k pixels) ccd array at lick observatory ... and some talk about maybe being able to get a 400x400 (160k pixels) for testing. there was an industry rumor that possibly spielberg might have a 2kx2k ccd that he was testing.
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: who invented CONFIG/SYS? Newsgroups: comp.sys.tandy,alt.folklore.computers Date: 24 Jun 2005 14:26:49 -0600
noone writes:
for some tandy topic drift, the original document formatter/processor
done on cms at the science center
https://www.garlic.com/~lynn/subtopic.html#545tech
was called script and used runoff-like "dot" commands. in '69, "G",
"M", and "L" invented GML at the science center
https://www.garlic.com/~lynn/submain.html#sgml
(and of course they had to come up with a name that matched their initials). it was later standardized in iso as SGML and later begat HTML, XML, FSML, SAML, etc.
univ. of waterloo did a cms script clone that was in use on cms at
cern ... and is the evolutionary path to html .. recent reference
in afc to the UofW and cern connection:
https://www.garlic.com/~lynn/2005k.html#58 Book on computer architecture for beginners
an IBM SE (system engineer) out of the LA branch office, in the late
70s, did a cms script clone and sold it on tandy machines (some of
the references seem to indicate it is still available):
https://www.garlic.com/~lynn/2000e.html#0 What good and old text formatter are there ?
https://www.garlic.com/~lynn/2000e.html#20 Is Al Gore The Father of the Internet?^
https://www.garlic.com/~lynn/2002b.html#46 ... the need for a Museum of Computer Software
https://www.garlic.com/~lynn/2002h.html#73 Where did text file line ending characters begin?
https://www.garlic.com/~lynn/2002p.html#54 Newbie: Two quesions about mainframes
https://www.garlic.com/~lynn/2003.html#40 InfiniBand Group Sharply, Evenly Divided
https://www.garlic.com/~lynn/2004o.html#5 Integer types for 128-bit addressing
https://www.garlic.com/~lynn/2005.html#46 8086 memory space
for total topic drift ... current w3c offices are only a couple blocks from the old science center location at 545 tech. sq.
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: The Worth of Verisign's Brand Newsgroups: netscape.public.mozilla.crypto Date: 25 Jun 2005 07:48:28 -0600
"Anders Rundgren" writes:
1) stale, static, year old information (say about whether a financial account may or may not have existed)
and
2) real-time response based on real-time and aggregated information whether they were being paid.
... would relying parties prefer to have stale, static year old information ... or would they prefer to have a real time answer whether or not they were being paid.
The issue is that OCSP goes to all the trouble to have a real-time information responding yes/no to whether the stale, static information was still current ... but doesn't provide a yes/no response to whether the relying party was actually being paid.
The contention is that going to all the trouble of having a real-time operation ... the yes/no response to being paid or not ... is of significantly more value to a relying party than whether or not some stale, static information was still valid.
the analogy is that you have a strip mall ... that has a bunch of retail stores. there is an appliance operation, a dry goods operation and an identification operation. you go into the appliance operation, buy an appliance and present a card ... that card initiates an online transaction, which checks your financial worth and recent transactions, and a relying party returns to the merchant a guarantee that they will be (and possibly already have been) paid.
you then go into the identification operation and present a card ... the digital certificate is retrieved by the operation ... it does an OCSP check to see if the certificate is still valid and then it verifies a digital signature operation. then you walk out of the store (random acts of gratuitous identification).
the issue, of course, is that very few verification or identification things are done just for the sake of doing them ... they are almost always done within the context of performing some other operation. the assertion has always been that the verification of stale, static information is only useful to the relying party when they have no recourse to more valuable, real-time information (and/or more recently the stale, static paradigm has tried to move into the no-value market niche, where the no-value operation can't justify the cost of a real-time operation).
you very seldom have acts of gratuitous identification occurring ... they are occurring within some context. furthermore there are a huge number of operations where the issue of identification is superfluous to the objective of the operation ... which may be primarily the exchange of value (as the object of the operation) and identification is truly redundant and superfluous (as can be demonstrated when anonymous cash can be used in lieu of financial institution based exchange of value).
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: The Worth of Verisign's Brand Newsgroups: netscape.public.mozilla.crypto Date: 25 Jun 2005 08:43:24 -0600
"Anders Rundgren" writes:
It would appear to be a descaling of PKI ... since there is only "one cert per bank".
It also has some number of operations that could be considered antithetical to the PKI design point. The consumer bank and the consumer have a predefined relationship. It is possible for the consumer bank to ship their public key for direct installation in the consumer's trusted public key repository.
The PKI design point has trusted third party CAs ... installing
their public key in the consumer's trusted public key repository
... the original model from the original electronic commerce
https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3
with this thing called SSL.
The CAs then digitally signed digital certificates for operations that the consumer had no prior relationship with. The consumer could validate the digital certificates with the "CA" public keys on-file in their local trusted public key repository (possibly manufactured and integrated into their application, like a browser).
For predefined relationship between a consumer and their financial institution ... they can exchange public keys and store them in their respective trusted public key repositories (a financial institution can provide the consumer some automation that assists in such an operation).
PKI would appear to actually make the existing infrastructure less secure ... rather than the consumer directly trusting their financial institution ... the consumer would rely on a TTP CA to provide all their trust about their own consumer financial institution.
In the mid-90s there was work done on PKI for payment transactions.
One of the things learned from the early 90s, x.509 identity
certificates ... was that they appeared to represent significant
privacy and liability issues. As a result many institutions retrenched
to relying-party-only digital certificates ...
https://www.garlic.com/~lynn/subpubkey.html#rpo
that contained little more than some sort of database lookup value (like an account number) and a public key. however, it was trivial to demonstrate such certificates were redundant and superfluous. One possible reason for the ease in demonstrating that such stale, static certificates were redundant and superfluous was that they appeared to totally violate the basic PKI design point, aka requiring an independent, trusted third party to establish trust between two entities that have never previously had any interaction.
For two parties that have a pre-existing relationship, it is possible for them to directly exchange public keys and store them in their respective trusted public key repositories ... and not have to rely on a trusted third party to tell them whether they should trust each other. In the case where a consumer's financial institution is the only entity with a public/private key pair ... it is possible for the consumer to obtain the public key of their trusted financial institution ... not needing to rely on some independent third party to provide them with trust.
The other issue from the mid-90s in the PKI-oriented payment transaction specification ... was that besides using redundant and superfluous stale, static certificates ... they also represented enormous payload bloat for the financial infrastructure. The typical iso 8583 payment transaction is on the order of 60-80 bytes. The RPO-certificate overhead for the operation was on the order of 4k to 12k bytes. The stale, static, redundant and superfluous digital certificate overhead represented an enormous, one hundred times increase in payload bloat.
Another question ... are you saying that the complete transaction goes via this new path ... or does the existing real-time iso 8583 transaction have to be performed in addition to this new real-time function (also being performed, at least doubling the number and overhead for real-time operations).
The existing iso 8583 operations goes as straight-through processing in a single real-time round trip. Does the introduction of these new operations improve on the efficiency of that existing single round-trip, straight through processing?
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: The Worth of Verisign's Brand Newsgroups: netscape.public.mozilla.crypto Date: 25 Jun 2005 09:11:27 -0600
with respect to iso 8583 payment network trust and single round-trip, straight thru processing.
part of the issue is that a PKI is redundant and superfluous since they don't need to rely on a trusted third party to provide trust between anonymous strangers that have never before met. in some sense, the pre-existing relationship and pre-existing trust allows for more efficient, single round-trip, straight through processing ... w/o having to go through a trust discovery process for every transaction (authentication should be sufficient).
in the normal operation, a merchant financial institution has a contractual relationship with merchants ... for which the merchant financial institution also takes some amount of financial liability. one of the well-used examples is the airline industry, both loved and somewhat feared by merchant financial institutions. There are a lot of high value transactions ... but there is also the prospect of the airline going bankrupt ... in the past this has represented something like $20m (or more) in outstanding airline tickets that the merchant financial institution had to make good on.
In a manner similar to the merchant financial institution and the merchant, there is also a pre-existing contractual relationship between a consumer and the consumer's financial institution (with the consumer's financial institution also accepting liability for their consumers). Again, no trusted third party PKI is required to establish trust on every operation that goes on between the consumer and the consumer's financial institution.
Over both the merchant financial institutions and the consumer financial institutions are the associations ... where there are pre-existing contractual relationships between the associations and the financial institutions. Again, there is no requirement for a trusted third party PKI to provide for a trust relationship on every transaction between the financial institutions and the associations.
A trusted third party PKI has no role in such an environment because there are pre-existing contractual, trust relationships already in place ... making a trusted third party PKI redundant and superfluous.
So not only is there an end-to-end contractual trust chain that follows from the merchant, to the merchant financial institution, to the associations, to the consumer financial institution, to the consumer ... this pre-existing end-to-end contractual trust chain can be relied upon to improve efficiency so that the whole trust establishment process doesn't have to be re-executed on every transaction ... allowing for single round-trip straight through processing.
The existing issue ... doesn't have so much to do with establishing trust relationships (the objective of TTP CAs & PKIs) ... it has to do with the vulnerabilities and exploits associated with the authentication technology currently in use. When trust has already been established, operations can rely on simpler authentication events ... rather than having to repeatedly re-establish the basis for identification and trust.
It is possible to simply improve on the integrity of the authentication technology ... w/o having to introduce the complexity and expense of repeatedly re-establishing trust for every operation.
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: The Worth of Verisign's Brand Newsgroups: netscape.public.mozilla.crypto Date: 25 Jun 2005 09:34:21 -0600
Anne & Lynn Wheeler writes:
one of the reasons that it is unlikely that the real transaction will go via the new path directly from the merchant to the consumer's financial institution (except in the "on-us" scenario where the same financial institution represents both the consumer and the merchant) is that the merchant financial institution has an interest in real-time tracking of the merchant activities (aka the merchant financial institution is liable for what the merchant does, in much the same way that the consumer financial institution is liable for consumer transactions).
having the merchant substitute direct transaction with the consumer financial institution would cut the merchant financial institution out of the single round-trip, straight-through process path. this might be likely were the consumer financial institution to not only assume liability for the consumer but also for the merchant (as in the "on-us" transaction scenario, aka "on-us" is defined as the situation where the same financial institution represents both the merchant and the consumer in the transaction).
on going posts in this thread:
https://www.garlic.com/~lynn/2005i.html#12 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#13 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#14 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#17 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#21 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#23 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#24 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005i.html#26 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005k.html#60 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005l.html#1 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005l.html#2 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005l.html#3 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005l.html#11 The Worth of Verisign's Brand
https://www.garlic.com/~lynn/2005l.html#12 The Worth of Verisign's Brand
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: The Worth of Verisign's Brand Newsgroups: netscape.public.mozilla.crypto Date: 25 Jun 2005 09:51:51 -0600
oh, and just for the fun of it ... past threads where we've exchanged posts regarding the nature of
1) pre-existing, established contractual trust relationships
vis-a-vis
2) dynamic establishment of trust on every transaction with the aid of a trusted third party PKI certification authority
and in the case of pre-existing, contractual trust relationships,
whether or not PKI certification authorities were redundant and
superfluous for also establishing the trust relationship
https://www.garlic.com/~lynn/aepay11.htm#70 Confusing Authentication and Identiification? (addenda)
https://www.garlic.com/~lynn/aepay12.htm#1 Confusing business process, payment, authentication and identification
https://www.garlic.com/~lynn/aadsm12.htm#22 draft-ietf-pkix-warranty-ext-01
https://www.garlic.com/~lynn/aadsm12.htm#41 I-D ACTION:draft-ietf-pkix-sim-00.txt
https://www.garlic.com/~lynn/aadsm12.htm#45 draft-ietf-pkix-warranty-extn-01.txt
https://www.garlic.com/~lynn/aadsm12.htm#48 draft-ietf-pkix-warranty-extn-01.txt
https://www.garlic.com/~lynn/aadsm12.htm#54 TTPs & AADS Was: First Data Unit Says It's Untangling Authentication
https://www.garlic.com/~lynn/aadsm17.htm#9 Setting X.509 Policy Data in IE, IIS, Outlook
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: Newsgroups (Was Another OS/390 to z/OS 1.4 migration Newsgroups: bit.listserv.ibm-main,alt.folklore.computers Date: 25 Jun 2005 13:12:34 -0600
Eric-PHMining@ibm-main.lst (Eric Bielefeld) writes:
the internal network was larger than the arpanet/internet from just
about the start up until around summer of '85.
https://www.garlic.com/~lynn/subnetwork.html#internalnet
at the great switch over from host/imp arpanet to internetworking
protocol on 1/1/83 ... the arpanet was around 250 nodes. by comparison,
not too long afterwards, the internal network passed 1000 nodes:
https://www.garlic.com/~lynn/internet.htm#22
i've claimed that one of the possible reasons was that the major internal networking nodes had a form of gateway functionality built into every node ... which the arpanet/internet didn't get until the 1/1/83 cut-over to internetworking protocol.
during this period in the early 80s ... there was some growing internal anxiety about this emerging internal networking prevalence ... that had largely grown up from the grassroots.
there were all kinds of efforts formed to try and study and understand what was happening. one such effort even brought in hiltz and turoff (the network nation) to help study what was going on.
also, there was a researcher assigned to sit in the back of my
office. they took notes on how i communicated in face-to-face (also
going to meetings with me), on the phone ... and they also had access
to contents of all my incoming and outgoing email as well as logs of
all my instant messages. this went on for 9 months ... the report
also turned into a stanford phd thesis (joint with language and
computer AI) ... as well as material for subsequent papers and
books. some references included in collection of postings on computer
mediated communication
https://www.garlic.com/~lynn/subnetwork.html#cmc
one of the stats was that supposedly for the 9 month period, that I exchanged email with an avg. of 275-some people per week (well before the days of spam).
later in the 80s ... there was the nsf network backbone RFP. we
weren't allowed to bid ... but we got an nsf study that reported that
the backbone that we were operating was at least five years ahead of
all bid submissions to build the nsfnet backbone.
https://www.garlic.com/~lynn/internet.htm#0
https://www.garlic.com/~lynn/subnetwork.html#hsdt
the nsf network backbone could be considered the progenitor of the modern internet .... actually deploying a backbone for supporting network of networks (aka an operational characteristic that goes along with the internetworking protocol technology).
note that into the early and mid-90s ... many of the newsgroups were still riding the usenet/uucp rails (not yet having moved to the internet in any significant way) ... with people having either a direct usenet feed or access via some BBS that had a usenet feed. Circa 93, I co-authored an article for boardwatch (bbs industry mag) about drivers I had done for a full usenet satellite broadcast feed.
the "ISPs" of this era were typically offering shell accounts and/or UUCP accounts (predating PPP and tcp/ip connectivity).
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: The Worth of Verisign's Brand Newsgroups: netscape.public.mozilla.crypto Date: 25 Jun 2005 13:35:26 -0600
"Anders Rundgren" writes:
as mentioned ... it is unlikely that 3D (going directly from the merchant to the consumer financial institution) is actually replacing the existing payment message transport ... unless it is actually suggesting that the merchant financial institution is no longer involved in representing the merchant ... and that the consumer financial institutions will be assuming all liability responsibility for the merchant.
furthermore, if you study the existing infrastructure ... not only does the federation of payments already exist ... but there are long term contractual trust vehicles in place that support that federation of payments (between merchant, merchant financial institution, association, consumer financial institution, and consumer).
if it isn't replacing the existing real-time, online, single round-trip, straight-through processing ... that directly involves all the financially responsible parties ... then presumably it is just adding a second, online, real-time transaction to an existing online, real-time transaction? (doubling the transaction and processing overhead).
one of the things that kindergarten, security 101 usually teaches is that if you bifurcate a transaction operation in such a way ... you may be opening up unnecessary security and fraud exposures ... in addition to possibly doubling the transaction and processing overhead.
now, the design point for the stale, static PKI model was establishing trust for a relying party that had no other recourse when handling first time communication from a party where no previous relationship existed. Supposedly 3D (assuming that it is just adding a second realtime, online transaction to an already existing, realtime online transaction) is doubling the number and overhead of online, realtime transactions ... in addition to managing to craft in some stale, static PKI processing.
the AADS model doesn't do anything about federation or non-federation
of payments. AADS simply provides improved
authentication technology integrated with standard business
operations:
https://www.garlic.com/~lynn/x959.html#aads
There have been some significant protocols defined over the past
several years ... where authentication was done as an independent
operation ... totally separate from doing authentication on the
transaction itself. In all such cases that I know of, it has been
possible to demonstrate man-in-the-middle (MITM) attacks
https://www.garlic.com/~lynn/subintegrity.html#mitm
where authentication is done separately from the actual transaction.
in the mid-90s the x9a10 financial standards working group was tasked
with preserving the integrity of the financial infrastructure for all
retail payments ... and came up with x9.59
https://www.garlic.com/~lynn/x959.html#x959
https://www.garlic.com/~lynn/subpubkey.html#privacy
which simply states that the transaction is directly authenticated. some
recent posts (in totally different thread) going into some number of
infrastructure vulnerabilities and the x9.59 financial standard
countermeasures:
https://www.garlic.com/~lynn/aadsm19.htm#17 What happened with the session fixation bug?
https://www.garlic.com/~lynn/aadsm19.htm#32 Using Corporate Logos to Beat ID Theft
https://www.garlic.com/~lynn/aadsm19.htm#38 massive data theft at MasterCard processor
https://www.garlic.com/~lynn/aadsm19.htm#39 massive data theft at MasterCard processor
https://www.garlic.com/~lynn/aadsm19.htm#40 massive data theft at MasterCard processor
https://www.garlic.com/~lynn/aadsm19.htm#44 massive data theft at MasterCard processor
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: The Worth of Verisign's Brand Newsgroups: netscape.public.mozilla.crypto Date: 25 Jun 2005 14:23:43 -0600
"Anders Rundgren" writes:
there has been some threads about having defense-in-depth.
the counter argument to defense-in-depth ... is that a lot of the defense-in-depth strategies drastically increase the complexity of the infrastructure ... and frequently, it is complexity itself that opens up vulnerabilities and exploits.
the countermeasure to complexity vulnerabilities and exploits frequently is KISS ... where simpler actually wins out over defense-in-depth and more complex. In part, defense-in-depth, while possibly creating overlapping layers ... frequently also creates cracks between such layers that allow the crooks to slip through.
a couple past threads mentioning defense-in-depth
https://www.garlic.com/~lynn/aepay11.htm#0 identity, fingerprint, from comp.risks
https://www.garlic.com/~lynn/2002j.html#40 Beginner question on Security
https://www.garlic.com/~lynn/aadsm19.htm#27 Citibank discloses private information to improve security
https://www.garlic.com/~lynn/2005b.html#45 [Lit.] Buffer overruns
numerous past posts mentioning KISS:
https://www.garlic.com/~lynn/aadsm2.htm#mcomfort Human Nature
https://www.garlic.com/~lynn/aadsm3.htm#kiss1 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss2 Common misconceptions, was Re: KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp-00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss3 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss4 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss5 Common misconceptions, was Re: KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss6 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss7 KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION :draft-ietf-pkix-scvp- 00.txt))
https://www.garlic.com/~lynn/aadsm3.htm#kiss8 KISS for PKIX
https://www.garlic.com/~lynn/aadsm3.htm#kiss9 KISS for PKIX .... password/digital signature
https://www.garlic.com/~lynn/aadsm3.htm#kiss10 KISS for PKIX. (authentication/authorization seperation)
https://www.garlic.com/~lynn/aadsm5.htm#liex509 Lie in X.BlaBla...
https://www.garlic.com/~lynn/aadsm7.htm#3dsecure 3D Secure Vulnerabilities?
https://www.garlic.com/~lynn/aadsm8.htm#softpki10 Software for PKI
https://www.garlic.com/~lynn/aepay3.htm#gaping gaping holes in security
https://www.garlic.com/~lynn/aepay7.htm#nonrep3 non-repudiation, was Re: crypto flaw in secure mail standards
https://www.garlic.com/~lynn/aepay7.htm#3dsecure4 3D Secure Vulnerabilities? Photo ID's and Payment Infrastructure
https://www.garlic.com/~lynn/aadsm10.htm#boyd AN AGILITY-BASED OODA MODEL FOR THE e-COMMERCE/e-BUSINESS ENTERPRISE
https://www.garlic.com/~lynn/aadsm11.htm#10 Federated Identity Management: Sorting out the possibilities
https://www.garlic.com/~lynn/aadsm11.htm#30 Proposal: A replacement for 3D Secure
https://www.garlic.com/~lynn/aadsm12.htm#19 TCPA not virtualizable during ownership change (Re: Overcoming the potential downside of TCPA)
https://www.garlic.com/~lynn/aadsm12.htm#54 TTPs & AADS Was: First Data Unit Says It's Untangling Authentication
https://www.garlic.com/~lynn/aadsm13.htm#16 A challenge
https://www.garlic.com/~lynn/aadsm13.htm#20 surrogate/agent addenda (long)
https://www.garlic.com/~lynn/aadsm15.htm#19 Simple SSL/TLS - Some Questions
https://www.garlic.com/~lynn/aadsm15.htm#20 Simple SSL/TLS - Some Questions
https://www.garlic.com/~lynn/aadsm15.htm#21 Simple SSL/TLS - Some Questions
https://www.garlic.com/~lynn/aadsm15.htm#39 FAQ: e-Signatures and Payments
https://www.garlic.com/~lynn/aadsm15.htm#40 FAQ: e-Signatures and Payments
https://www.garlic.com/~lynn/aadsm16.htm#1 FAQ: e-Signatures and Payments
https://www.garlic.com/~lynn/aadsm16.htm#10 Difference between TCPA-Hardware and a smart card (was: example:secure computing kernel needed)
https://www.garlic.com/~lynn/aadsm16.htm#12 Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)
https://www.garlic.com/~lynn/aadsm17.htm#0 Difference between TCPA-Hardware and a smart card (was: example: secure computing kernel needed)<
https://www.garlic.com/~lynn/aadsm17.htm#41 Yahoo releases internet standard draft for using DNS as public key server
https://www.garlic.com/~lynn/aadsm17.htm#60 Using crypto against Phishing, Spoofing and Spamming
https://www.garlic.com/~lynn/aadsmail.htm#comfort AADS & X9.59 performance and algorithm key sizes
https://www.garlic.com/~lynn/aepay10.htm#76 Invisible Ink, E-signatures slow to broadly catch on (addenda)
https://www.garlic.com/~lynn/aepay10.htm#77 Invisible Ink, E-signatures slow to broadly catch on (addenda)
https://www.garlic.com/~lynn/aepay11.htm#73 Account Numbers. Was: Confusing Authentication and Identiification? (addenda)
https://www.garlic.com/~lynn/99.html#228 Attacks on a PKI
https://www.garlic.com/~lynn/2001.html#18 Disk caching and file systems. Disk history...people forget
https://www.garlic.com/~lynn/2001l.html#1 Why is UNIX semi-immune to viral infection?
https://www.garlic.com/~lynn/2001l.html#3 SUNW at $8 good buy?
https://www.garlic.com/~lynn/2002b.html#22 Infiniband's impact was Re: Intel's 64-bit strategy
https://www.garlic.com/~lynn/2002b.html#44 PDP-10 Archive migration plan
https://www.garlic.com/~lynn/2002b.html#59 Computer Naming Conventions
https://www.garlic.com/~lynn/2002c.html#15 Opinion on smartcard security requested
https://www.garlic.com/~lynn/2002d.html#0 VAX, M68K complex instructions (was Re: Did Intel Bite Off MoreThan It Can Chew?)
https://www.garlic.com/~lynn/2002d.html#1 OS Workloads : Interactive etc
https://www.garlic.com/~lynn/2002e.html#26 Crazy idea: has it been done?
https://www.garlic.com/~lynn/2002e.html#29 Crazy idea: has it been done?
https://www.garlic.com/~lynn/2002i.html#62 subjective Q. - what's the most secure OS?
https://www.garlic.com/~lynn/2002k.html#11 Serious vulnerablity in several common SSL implementations?
https://www.garlic.com/~lynn/2002k.html#43 how to build tamper-proof unix server?
https://www.garlic.com/~lynn/2002k.html#44 how to build tamper-proof unix server?
https://www.garlic.com/~lynn/2002m.html#20 A new e-commerce security proposal
https://www.garlic.com/~lynn/2002m.html#27 Root certificate definition
https://www.garlic.com/~lynn/2002p.html#23 Cost of computing in 1958?
https://www.garlic.com/~lynn/2003b.html#45 hyperblock drift, was filesystem structure (long warning)
https://www.garlic.com/~lynn/2003b.html#46 internal network drift (was filesystem structure)
https://www.garlic.com/~lynn/2003c.html#66 FBA suggestion was Re: "average" DASD Blocksize
https://www.garlic.com/~lynn/2003d.html#14 OT: Attaining Perfection
https://www.garlic.com/~lynn/2003h.html#42 IBM says AMD dead in 5yrs ... -- Microsoft Monopoly vs
https://www.garlic.com/~lynn/2003.html#60 MIDAS
https://www.garlic.com/~lynn/2003m.html#33 MAD Programming Language
https://www.garlic.com/~lynn/2003n.html#37 Cray to commercialize Red Storm
https://www.garlic.com/~lynn/2004c.html#26 Moribund TSO/E
https://www.garlic.com/~lynn/2004e.html#26 The attack of the killer mainframes
https://www.garlic.com/~lynn/2004e.html#30 The attack of the killer mainframes
https://www.garlic.com/~lynn/2004f.html#58 Infiniband - practicalities for small clusters
https://www.garlic.com/~lynn/2004f.html#60 Infiniband - practicalities for small clusters
https://www.garlic.com/~lynn/2004g.html#24 |d|i|g|i|t|a|l| questions
https://www.garlic.com/~lynn/2004h.html#51 New Method for Authenticated Public Key Exchange without Digital Certificates
https://www.garlic.com/~lynn/2004q.html#50 [Lit.] Buffer overruns
https://www.garlic.com/~lynn/2005.html#10 The Soul of Barb's New Machine
https://www.garlic.com/~lynn/2005.html#12 The Soul of Barb's New Machine
https://www.garlic.com/~lynn/2005c.html#22 [Lit.] Buffer overruns
https://www.garlic.com/~lynn/aadsm19.htm#27 Citibank discloses private information to improve security
https://www.garlic.com/~lynn/2005i.html#19 Improving Authentication on the Internet
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: Bank of America - On Line Banking *NOT* Secure? Newsgroups: sci.crypt Date: 25 Jun 2005 15:24:21 -0600
"John E. Hadstate" writes:
1) spoofed website and/or MITM attack 2) eavesdropping
there have been mechanisms that allow key exchange that don't require certificates & CAs ... that then would allow encrypted sessions as a countermeasure for eavesdropping.
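as a sketch of such a certificate-less key exchange (ephemeral diffie-hellman style, here with x25519 via the python cryptography package) ... note that by itself this only counters eavesdropping, it doesn't authenticate the other end (i.e. no MITM countermeasure):

# hypothetical sketch: key agreement with no certificates or CAs involved
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import x25519
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# each side generates an ephemeral key pair and sends only the public half
client_priv = x25519.X25519PrivateKey.generate()
server_priv = x25519.X25519PrivateKey.generate()

# both ends derive the same shared secret
client_shared = client_priv.exchange(server_priv.public_key())
server_shared = server_priv.exchange(client_priv.public_key())
assert client_shared == server_shared

# turn the shared secret into a session key for encrypting the traffic
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"session").derive(client_shared)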
the SSL domain name certificates were supposed to provide that the
domain name that you typed in for the URL ... matched the domain name
provided in an SSL domain name certificate (from the server)
https://www.garlic.com/~lynn/subpubkey.html#sslcert
and subsequently leveraged to provide key exchange with the valid
end-point and end-to-end encryption.
the problem was that a lot of merchants ... considering the original
SSL target for e-commerce
https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3
... found that they got something like five-times the thruput using non-SSL. The result is that the merchants avoided using SSL & https for non-eavesdropping scenarios ... reserving it solely for eavesdropping-like operations. in the e-commerce scenario, that typically meant the user eventually got to click on a "check-out" or "pay" button, which, in turn, invoked SSL for the payment phase.
the problem was that the URL the user provided was never checked against the certificate of the site the user was visiting. So if the user happened to be dealing with a spoofed site ... when they finally got to the "pay" button ... the "pay" button generated a URL (on the user's behalf), and if it was a spoofed site, it was highly likely that the URL the spoofed site provided as part of the "pay" button would match whatever was in the SSL domain name certificate from the server that the user had been directed to.
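for contrast, a small sketch of what checking against the user-entered name looks like ... python's ssl module does the certificate/host-name match when given the name the user actually typed (the host name below is just a placeholder):

# hypothetical sketch: verify the server certificate against the user-entered host name
import socket, ssl

user_entered_host = "www.example.com"   # placeholder for whatever the user actually typed

context = ssl.create_default_context()  # verifies the certificate chain
context.check_hostname = True           # and matches the cert against server_hostname

with socket.create_connection((user_entered_host, 443)) as raw:
    with context.wrap_socket(raw, server_hostname=user_entered_host) as tls:
        # if the certificate doesn't match user_entered_host, the handshake raises
        print(tls.getpeercert()["subject"])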
the issue is that if there really is a spoofed site vulnerability and the user might happen to be visiting a spoofed site (which is in large part the justification for SSL, ssl domain name certificates, certification authorities, etc) ... then nothing such a suspect site does or provides should be trusted ... including any javascript or other html related stuff that invokes ssl (as an eavesdropping countermeasure) ... since it may also be to another fraudulent site (and they are just keeping the other crooks from eavesdropping on their spoofed communication).
spoofed site technology can either be a straight spoofed site ... its
own site with all its own files providing the look & feel of the real
site. A spoofed site might also be done as a man-in-the-middle attack
... where the spoofed site is actually acting as a middle man between
the end-user and the real site ... while possibly subtly modifying the
communication passing thru:
https://www.garlic.com/~lynn/subintegrity.html#mitm
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: Newsgroups (Was Another OS/390 to z/OS 1.4 migration Newsgroups: bit.listserv.ibm-main,alt.folklore.computers Date: 25 Jun 2005 21:30:29 -0600David Scheidt writes:
re:
https://www.garlic.com/~lynn/2005l.html#16 Newsgroups (Was Another OS/390 to z/OS 1.4 migration
old usenet newsgroup posting from the feed:
Path: wheeler!pagesat!olivea!hal.com!darkstar.UCSC.EDU!osr
From: vern@daffy.ee.lbl.gov (Vern Paxson)
Newsgroups: comp.os.research
Subject: Paper on wide-area TCP growth trends available for ftp
Date: 13 May 1993 17:52:04 GMT
Lines: 34
Approved: comp-os-research@ftp.cse.ucsc.edu
Message-ID: <1su1s4INNsdj@darkstar.UCSC.EDU>
NNTP-Posting-Host: ftp.cse.ucsc.edu
Originator: osr@ftp

The following paper is now available via anonymous ftp to ftp.ee.lbl.gov. Retrieve WAN-TCP-growth-trends.ps.Z (about 100KB):

Growth Trends in Wide-Area TCP Connections

Vern Paxson
Lawrence Berkeley Laboratory and EECS Division, University of California, Berkeley
vern@ee.lbl.gov

We analyze the growth of a medium-sized research laboratory's wide-area TCP connections over a period of more than two years. Our data consisted of six month-long traces of all TCP connections made between the site and the rest of the world. We find that {\em smtp\/}, {\em ftp\/}, and {\em X11} traffic all exhibited exponential growth in the number of connections and bytes transferred, at rates significantly greater than that at which the site's overall computing resources grew; that individual users increasingly affected the site's traffic profile by making wide-area connections from background scripts; that the proportion of local computers participating in wide-area traffic outpaces the site's overall growth; that use of the network by individual computers appears to be constant for some protocols ({\em telnet}) and growing exponentially for others ({\em ftp\/}, {\em smtp\/}); and that wide-area traffic geography is diverse and dynamic.

If you have trouble printing it let me know and I'll mail you hardcopy.

Vern

Vern Paxson                     vern@ee.lbl.gov
Systems Engineering             ucbvax!ee.lbl.gov!vern
Lawrence Berkeley Laboratory    (510) 486-7504

... backyard full usenet feed
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: The Worth of Verisign's Brand Newsgroups: netscape.public.mozilla.crypto Date: 26 Jun 2005 08:30:30 -0600"Anders Rundgren" writes:
i.e., consumers could register the public key for their financial institution in their trusted public key store.
this also would eliminate many of the bank site spoofing
vulnerabilities ... recent discussion
https://www.garlic.com/~lynn/2005l.html#19
in the above ... it discusses various kinds of spoofing and
MITM-attacks ... where the end user is provided with a URL ... rather
than entering it themselves. Then you have an exploit of SSL ...
which is only verifying the domain name in the entered URL against the
domain name in the supplied certificate. If you aren't entering the
URL ... but it is being provided by an attacker ... then they are
likely to provide a URL that corresponds to a certificate that they
have valid rights for. This has been a long recognized characteristic.
https://www.garlic.com/~lynn/subpubkey.html#sslcert
A consumer, having vetted a bank's public key for storing in their own trusted public key repository ... then can use that vetted public key for future communication with their financial institution ... and not be subject to vulnerabilities and exploits of an externally provided (certificate-based) public key that has had no vetting .. other than it is a valid public key and belongs to somebody.
The purpose for PKI has been for allowing relying parties to establish some level of trust when dealing with first-time encounters with entities that are otherwise complete strangers ... and the relying party has no other recourse for accessing information to establish trust. The design point was somewhat from the early 80s when there was much lower level of online connectivity and relying parties frequently operated in offline environment.
With the ubiquitous proliferation of the internet, those offline pockets are being drastically reduced. Somewhat as a result, some PKIs have attempted to move into the no-value market segment ... where a relying party is online ... but the value of the operation doesn't justify performing an online transaction. The issue is that as the internet becomes much more pervasive ... the cost of online internet operations is radically dropping ... which in turn is drastically reducing the no-value situations that can't justify an online operation.
Presumably in the 3d secure PKI scenario, it has a financial institution's CC-specific certificate that is targeted specifically at relying parties that have had no prior dealings with that financial institution(*?*).
Presumably this implies the merchant as a relying party in dealing with the consumer's financial institution (the other alternative is possibly the consumer as a relying party in dealing with the merchant's financial institution ... but I have seen nothing that seems to support that scenario). Now, going back to well before the rise of PKI to address the offline trust scenario ... the payment card industry had online transactions that went from the merchant through a federated infrastructure all the way to the consumer's financial institution and back as straight through processing. This included contractual trust establishment with various kinds of obligations and liabilities ... that included the consumer's financial institution assuming certain liabilities on behalf of the consumer and the merchant's financial institution assuming certain liabilities on behalf of the merchant. Possibly because of these obligations ... both financial institutions have an interest in the transaction passing through them.
As mentioned before ... it appears that 3d secure doesn't eliminate the existing online real-time transaction that conforms to some significant contractual and liability obligations. 3d secure appears to add an additional, 2nd online transaction ... allowing the merchant to be directly in communication with the consumer's financial institution (bypassing the established contractual and liability obligations involving the merchant's financial institution). Furthermore, this 3d secure appears to include a PKI certificate ... targeted at establishing trust where the relying party has no other recourse for trust establishment. However, the merchant is already covered under the contractual trust operations that have been standard business practice for decades.
So what possible motivation is there for a merchant to add additional overhead and processing(*?*).
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: The Worth of Verisign's Brand Newsgroups: netscape.public.mozilla.crypto Date: 26 Jun 2005 09:36:46 -0600"Anders Rundgren" writes:
the issue in the x9.59 standards world
https://www.garlic.com/~lynn/x959.html#x959
https://www.garlic.com/~lynn/subpubkey.html#privacy
was the threats and vulnerabilities in the authentication technology and that the integrity level has possibly eroded over the past 30 years or so (in the face of technology advances).
the primary issue was the authentication of the consumer for the transaction. this cropped up in two different aspects
1) is the consumer originating the transaction, really the entity that is authorized to perform transactions against the specific account
2) somewhat because of authentication integrity issues, starting at
least in the 90s, there was an increase in skimming and harvesting
... either direct skimming of the magnetic stripe information or
harvesting of account transaction databases ... both supporting later
counterfeiting activities enabling generation of fraudulent
transactions
https://www.garlic.com/~lynn/subintegrity.html#harvest
the countermeasure cornerstones of x9.59 then became:
1) use technology for drastically increasing the authentication strength directly associated with transactions ... as a countermeasure to not being sure that the entity originating the transaction is really the entity authorized to perform transactions for that account.
2) a business rule that PANs (account numbers) used in strongly authenticated transactions aren't allowed to be used in poorly authenticated or non-authenticated transactions (i.e., don't authorize a poorly authenticated transaction having a PAN that is identified for use only in strongly authenticated transactions). this is a countermeasure to the skimming/harvesting vulnerabilities and exploits (hypothetical sketch below).
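a purely hypothetical sketch of that second business rule (the field names and the flag are mine, nothing from the actual x9.59 standard): the issuer's account record marks the PAN as usable only with strong authentication, so a harvested/skimmed account number is useless on its own:

# issuer-side flag: these PANs may only be used in strongly authenticated transactions
strong_auth_only_pans = {"4111111111111111"}

def authorize(pan: str, signature_verified: bool) -> str:
    # decline any transaction on a flagged PAN that doesn't carry a verified
    # digital signature ... knowledge of the account number alone is worthless
    if pan in strong_auth_only_pans and not signature_verified:
        return "decline"
    return "continue normal authorization"

print(authorize("4111111111111111", signature_verified=False))   # decline
print(authorize("4111111111111111", signature_verified=True))    # continue normal authorization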
there was a joke with regard to the second countermeasure
cornerstone that you could blanket the world in miles-deep
cryptography and you still couldn't contain the skimming/harvesting
activities. the second cornerstone just removes skimming/harvesting
as having any practical benefit in support of crooks and fraudulent
transactions. slightly related post on security proportional to risk:
https://www.garlic.com/~lynn/2001h.html#61
recent similar posting in another thread:
https://www.garlic.com/~lynn/2005k.html#23
https://www.garlic.com/~lynn/aadsm16.htm#20
having helped with the deployment of the e-commerce ssl based
infrastructure
https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3
we recognized a large number of situations where PKIs that had originally been designed to address trust issues between relying parties and other entities that had no previous contact ... were being applied to environments that had long-term and well established trust and relationship management infrastructures (aka if one has a relationship management infrastructure that provides long-term and detailed trust history about a specific relationship ... then a PKI becomes redundant and superfluous as a trust establishment mechanism).
In the AADS model
https://www.garlic.com/~lynn/x959.html#aads
involving certificate-less public key operation
https://www.garlic.com/~lynn/subpubkey.html#certless
we attempted to map publickey-based authentication technology into existing and long-term business processes and relationship management infrastructures.
the existing authentication landscape is largely shared-secret based
https://www.garlic.com/~lynn/subintegrity.html#secret
where the same information that is used for originating a transaction is also used for verifying a transaction. this opens up harvesting vulnerabilities and threats against the verification repositories.
basically, asymmetric cryptography is a technology involving pairs of keys; data encoded by one key is decoded by the other key.
a business process has been defined for asymmetric cryptography where one of the key pair is designated "public" and can be widely distributed. The other of the key pair is designated "private" and kept confidential and never divulged.
a further business process has been defined called "digital signatures" where a hash of some message or document is encoded with a private key. later a relying party can recalculate the hash of the same message or document, decode the digital signature with the corresponding public key and compare the two hashes. if the two hashes are the same ... then the relying party can assume (a toy sketch follows below):
1) the message/document hasn't been modified since being digitally signed
2) something you have authentication, aka the originating entity has access to, and use of the corresponding private key.
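a toy, self-contained sketch of that flow ... textbook RSA with tiny numbers (nothing like real key sizes, purely to make the encode/decode/compare steps concrete):

import hashlib

# toy RSA key pair (textbook example values): public key = (n, e), private key = (n, d)
p, q = 61, 53
n = p * q            # 3233
e = 17
d = 2753             # satisfies d*e = 1 mod (p-1)*(q-1)

def digest(msg: bytes) -> int:
    # reduce the hash into the toy modulus range so the arithmetic works
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

message = b"some message or document"
signature = pow(digest(message), d, n)     # originator: encode hash with private key

# relying party: recompute the hash, decode the signature with the public key, compare
assert pow(signature, e, n) == digest(message)
# if equal: 1) message unaltered since signing, 2) signer has use of the private key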
an additional business process was created called PKIs and certification authorities that was targeted at the environment where a relying party is dealing with first time communication with a stranger and has no other recourse for trust establishment about the total stranger. note however, that PKIs and certification authorities can be shown to be redundant and superfluous in environments where the relying party has long established business processes and trust/relationship management infrastructures for dealing with pre-existing relationships.
However, just because the PKI and certification authority business process can be shown to be redundant and superfluous in most existing modern day business operations ... that doesn't preclude digital signature technology being used (in a certificate-less environment) as a stronger form of authentication (relying on existing and long established relationship management processes for registering a public key in lieu of shared-secret based authentication material).
leveraging long established relationship management infrastructures for registering public key authentication material in lieu of shared-secret authentication material (and use of public key oriented authentication) is a countermeasure to many kinds of harvesting and skimming vulnerabilities and threats. Many of the identity theft reports result from harvesting/skimming of common, static, shared-secret authentication material for later use in fraudulent transactions. The advantage of public key based authentication material is that while it can be used for authentication purposes, it doesn't have the short-coming of also being usable for originating fraudulent transactions and/or impersonation.
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: lynn@garlic.com Newsgroups: netscape.public.mozilla.crypto Subject: Re: The Worth of Verisign's Brand Date: Mon, 27 Jun 2005 08:45:04 -0700Anders Rundgren wrote:
in the mid-90s, one of the pki oriented payment structures had the financial institutions registering public keys and issuing relying-party-only certificates.
the issue wasn't with the registering of the public keys ... since the financial institutions have well established relationship management infrastructures.
the problem was trying to mandate that a simple improvement in authentication technology be shackled to an extremely cumbersome, expensive, redundant and superfluous PKI infrastructure.
the other issue ... was that the horribly complex, heavyweight and
expensive PKI infrastructure had limited their solution to only
addressing eavesdropping of transactions in-flight ... which was
already adequately addressed by the existing e-commerce SSL solution
https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3
https://www.garlic.com/~lynn/subpubkey.html#sslcert
and was providing no additional improvement in the integrity landscape.
so you have a simple and straight-forward mechanism for a minor technology improvement in authentication ... shackled to a horribly complex, expensive, redundant and superfluous PKI operation which was providing no additional countermeasures to the major e-commerce threats and vulnerabilities (beyond the existing deployed SSL solution).
Now if you were a business person and were given an alternative between two solutions that both effectively addressed the same subset of e-commerce vulnerabilities and threats ... one, the relatively straight-forward and simple SSL operation, and the other a horribly complex, expensive, redundant and superfluous PKI operation .... which would you choose?
An additional issue with the horribly complex, expensive, redundant and
superfluous PKI based solutions was the horrible payload bloat
represented by the relying-party-only certificates
https://www.garlic.com/~lynn/subpubkey.html#rpo
the typical payment message payload size is on the order of 60-80 bytes ... while the attached redundant and superfluous relying-party-only digital certificates represented a payload size on the order of 4k-12k bytes .... a horrible payload bloat increase by a factor of one hundred.
As mentioned in the previous posting,
https://www.garlic.com/~lynn/2005l.html#22 The Worth of Verisign's Brand
the x9a10 financial standard working group which was tasked with
preserving the integrity of the financial infrastructure for all
retail payments actually attempted to address major additional threats
and vulnerabilities with x9.59
https://www.garlic.com/~lynn/x959.html#x959
and there was actually a pilot project that was deployed for iso 8583
nacha trials ... see references at
https://www.garlic.com/~lynn/x959.html#aads
part of the market acceptance issue is that the marketplace has been so saturated with PKI oriented literature .... that if somebody mentions digital signatures ... it appears to automatically bring forth images of horribly expensive, complex, redundant and superfluous PKI implementations.
From: lynn@garlic.com Newsgroups: netscape.public.mozilla.crypto Subject: Re: The Worth of Verisign's Brand Date: Mon, 27 Jun 2005 09:18:39 -0700Anders Rundgren wrote:
is that there are absolutely no changes to existing infrastructures, business processes and/or message flows ... they all stay the same ... there is just a straight-forward upgrade of the authentication technology (while not modifying existing infrastructures, business processes, and/or message flows).
aggressive cost optimization for a digital-signature-only hardware token would result in a negligible difference between the fully-loaded roll-out costs for the current contactless, RFID program and the fully-loaded costs for a nearly identical operation for a contactless, digital signature program.
the advantage over some of the earlier pki-oriented payment rollouts
https://www.garlic.com/~lynn/2005l.html#23
is that in addition to addressing the eavesdropping vulnerability for data-in-flight (already addressed by the simpler SSL-based solution) ... it also provides countermeasures for impersonation vulnerabilities as well as numerous kinds of data breach and identity theft vulnerabilities.
https://www.garlic.com/~lynn/2005l.html#22
From: lynn@garlic.com Subject: Re: PKI Crypto and VSAM RLS Newsgroups: bit.listserv.ibm-main Date: 29 Jun 2005 16:31:18 -0700Hal Merritt wrote:
a (public key) business process has been defined where one key is identified as public and made readily available. the other of the key pair is identified as private and kept confidential and never divulged. public keys can be registered in place of pins, passwords, and/or other shared-secrets for authentication. in a shared-secret environment .... somebody having access to the registered authentication information also has access to the same information that is used for origination and therefore can impersonate. in a public key environment, somebody with access to the public key can only authenticate but can't also use the public key to impersonate.
there is an additional business process that has been defined called digital signatures. in a digital signature, the originator computes the hash of a message and encodes it with their private key. they then transmit the message and the digital signature. the receiver then recomputes the hash of the message, decodes the digital signature with the public key (producing the original hash) and compares the two hashes. If they are equal, the recipient then has some assurance that 1) the message hasn't been altered and 2) the sender is authenticated.
another business process has been defined called PKI (Public Key Infrastructure) and involves certification authorities (CAs) and (digitally signed) digital certificates. This was somewhat targeted at the offline email environment of the early 80s; somebody dialed their local (electronic) post office, exchanged email, and hung up. They might now have to deal with a first time communication from a total stranger with whom they had never previously communicated ... and they had neither local resources nor access to online resources as a basis for establishing trust in the total stranger (aka they couldn't call up credit bureaus, financial institutions, etc).
Basically the local user (or relying party) has a local trusted repository of public keys .... public keys belonging to entities that they already trust. In PKI, this local trusted public key repository is extended to include the public keys of certification authorities (or CAs). CAs register the public keys and other information about individual entities. They then create something they call a "digital certificate" which is a message containing the entity's registered information (including their public key) and is digitally signed with the CA's private key.
Now, a total stranger originating some message for first time communication, can digitally sign the message; sending off a combination of the basic message, their digital signature, and the digital certificate that has been issued to them.
The recipient receives the first time communication from a total stranger, validates the attached digital certificate (by using the CA's public key from their trusted public key repository), extracts the stranger's public key from the digital certificate and validates the message digital signature ... and then processes the message. They can trust that the message originated from some entity which is described by the certified information in the attached digital certificate.
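a toy walk-through of those relying-party steps (purely illustrative ... not a real certificate format, and the tiny textbook-RSA numbers are just to make the arithmetic work): verify the CA's signature on the certificate with the CA public key from the local trusted repository, pull the stranger's public key out of the certificate, then verify the message signature:

import hashlib, json

def h(data: bytes, n: int) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

# two toy RSA key pairs (textbook values, illustration only)
CA_N, CA_E, CA_D = 3233, 17, 2753        # certification authority
S_N, S_E, S_D = 2773, 17, 157            # the "total stranger"

# CA issues a certificate binding the stranger's identity to their public key
cert_body = json.dumps({"subject": "stranger", "n": S_N, "e": S_E}).encode()
cert_sig = pow(h(cert_body, CA_N), CA_D, CA_N)      # signed with the CA private key

# the stranger digitally signs a first-time message with their own private key
message = b"first time communication from a total stranger"
msg_sig = pow(h(message, S_N), S_D, S_N)

# relying party: the CA public key comes from the local trusted repository
assert pow(cert_sig, CA_E, CA_N) == h(cert_body, CA_N)    # certificate validates
stranger = json.loads(cert_body)                          # extract registered public key
assert pow(msg_sig, stranger["e"], stranger["n"]) == h(message, stranger["n"])
print("message accepted as coming from", stranger["subject"])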
In the early 90s, there were PKI x.509 identity certificates where the CAs were pondering overloading the certificate with lots of personal information ... since they couldn't necessarily predict what kind of information future relying parties might be interested in (the more relying parties that found the x.509 certificates useful ... the more valuable the x.509 certificates and possibly the more that the CAs might be able to charge for the certificates).
In the mid-90s, some number of institutions were starting to realize
that x.509 identity certificates grossly overloaded with personal
information represented significant privacy and liability issues. As a
result, there was the introduction of relying-party-only certificates
https://www.garlic.com/~lynn/subpubkey.html#rpo
which just contained some sort of database lookup identifier (like an account number) and a public key. However, it is trivial to show that relying-party-only certificates are redundant and superfluous, since their use implies that the recipient already has a repository of all the necessary information and therefore doesn't require a digital certificate (which was designed to address first time communication with a stranger where the recipient had no other available recourse to information about the originator).
Furthermore, with the Internet becoming more pervasive and ubiquitous, the situations where a recipient doesn't have other (real-time and online) avenues for information about a stranger (in a first time communication) are rapidly dwindling. Some of the PKI CAs have attempted to move into the no-value market segment ... where a recipient might have available means to obtain information about a stranger .... however, the value of the operation doesn't justify the expense. A somewhat secondary issue is that as the Internet becomes more and more pervasive .... the cost of using it is also rapidly declining ... further squeezing the no-value market segment where the recipient can't justify accessing realtime, online information.
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: ESCON to FICON conversion Newsgroups: bit.listserv.ibm-main Date: 29 Jun 2005 18:04:32 -0600nigel.salway@ibm-main.lst (Salway, Nigel) writes:
one of the rs/6000 engineers took a look at escon ... and somewhat modified it, increasing bit rate from around 200mbit/sec to 220mbit/sec, using significantly cheaper drivers. This was released with the original rs/6000 as serial link adapter (SLA).
we had been working with several labs and industry standards groups. LLNL was somewhat taking a high-speed serial copper installation they had and pushing it in the standards groups as the fiber channel standard (at the time, fiber, 1gbit/sec, full-duplex, 2gbit/sec aggregate ... with none of the additional thruput degradation associated with the latencies involved in turning around a half-duplex connection).
The SLA engineer had started work on a 800-mbit (per second, full duplex) version ... but we managed to convince him to join the fiber channel standard work (where he became editor of the fiber channel standard document). By at least 1992, you were starting to see FCS connectivity.
One of the issues in the FCS standards group were some of the battles where traditional mainframe, half-duplex oriented engineers were attempting to layer half-duplex "mainframe channel i/o" protocols on top of the underlying full-duplex fiber channel standard.
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: How does this make you feel? Newsgroups: comp.arch Date: 30 Jun 2005 09:27:35 -0600"John Mashey" writes:
1) execute instruction that crossed 4k boundary (2 pages)
2) SS instruction (target of the execute) that crossed 4k boundary (2 more pages)
3) source location of the SS instruction that crossed page boundary (2 more pages)
4) target location of the SS instruction that crossed page boundary (2 more pages)
the interruptable "long" instructions (introduced with 370) were not defined as having all required storage locations pre-checked; they were defined as being able to check on a byte-by-byte basis and causing an interrupt (with updated register values which allowed for restarting the instruction). I was involved in shooting a microcode bug on the 370/125 (& 370/115) where the microcoders had incorrectly checked starting and ending locations on long instructions before starting (if something was wrong with the ending location, it would interrupt before starting the instruction ... which was correct for the 360 instructions but incorrect for the 370 "long" instructions).
in a recent discussion on this subject ... it has been brought to my attention that more recent machines have fixed a "bug" in the (original 360) translate SS instructions. translate instructions take a 256-character "table" that is used for changing or testing the source string. standard 360 involved checking the table starting address and the table ending address (start+256). However, a programmer that knew they had a constrained set of characters in the input stream was allowed to define "short" tables (less than 256 bytes). However, the original instruction implementations would check the worst case table ending address (start+256). the instruction bug fix is that if the start of a table is within 256 bytes of a boundary, the instruction is pre-executed, checking each byte in the input string for possible values that would address a table byte on the other side of the boundary (aka the translate instructions take each input byte and add its value to the table start address to index a one byte field).
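a rough sketch (in python, purely to illustrate the semantics described above, not the actual hardware behavior) of what translate does ... each input byte indexes the table and is replaced by the table byte, which is why a "short" table is only safe when the input characters are known to be constrained:

def translate(data: bytes, table: bytes) -> bytes:
    # the real TR instruction replaces the bytes in place; here we just build the result
    # table[b] faults if an input byte indexes past the end of a short table
    return bytes(table[b] for b in data)

# a short 16-entry table is fine as long as every input byte is < 16
short_table = bytes(range(15, -1, -1))      # maps 0..15 to 15..0
print(translate(bytes([0, 1, 2, 15]), short_table))
# translate(bytes([200]), short_table) would raise IndexError -- the analogue of
# touching storage beyond the end of the short table (the spurious-fault case)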
some recent postings
https://www.garlic.com/~lynn/2005j.html#36 A second look at memory access alignment
https://www.garlic.com/~lynn/2005j.html#38 virtual 360/67 support in cp67
https://www.garlic.com/~lynn/2005j.html#39 A second look at memory access alignment
https://www.garlic.com/~lynn/2005j.html#40 A second look at memory access alignment
https://www.garlic.com/~lynn/2005j.html#44 A second look at memory access alignment
https://www.garlic.com/~lynn/2005k.html#41 Title screen for HLA Adventure? Need help designing one
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: IBM/Watson autobiography--thoughts on? Newsgroups: alt.folklore.computers Date: 30 Jun 2005 16:19:01 -0600rpl writes:
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: Importing CA certificate to smartcard Newsgroups: netscape.public.mozilla.crypto Date: 01 Jul 2005 09:20:05 -0600"Vivek Chadha" writes:
asymmetric key cryptography is the technology ... what one key of a key pair encodes, the other of the key pair decodes (differentiated from symmetric key cryptography where the same key is used for encoding and decoding).
there is a business process defined called "public key", where one of
the keys (of a key pair) is labeled "public" and freely
distributed. The other key is labeled "private" and is kept
confidential and never divulged.
layered on this is another business process called "digital signature".
the originator computes a hash of a message, encodes the hash with
their private key. they then transmit the original message and the
attached "digita signature". the recipient decodes the digital
signature with the public key, recomputes the message hash and
compares the two hash values (the recomputed and the decoded). if they
are equal, then the recipient can assume 1) the message hasn't been
altered and 2) something you have authentication (aka the sender has
access to and use of the corresponding private key) ... aka from
3-factor authentication
• something you have
• something you know
• something you are
most business operations have long standing and well established
relationship management infrastructures. they use such relationship
management infrastructures to record things about the relationship
(address, current account balance, permissions, etc) as well as
authentication material ... in the past, frequently shared-secret
(mother's maiden name, SSN, pin or password). however, it is also
possible to use these well established and long standing relationship
management infrastructures to record a public key as authentication
material. the advantage that public keys have over shared-secrets is
that with shared-secrets people with access to the relationship
management infrastructure can also use the shared-secret
authentication material for impersonation. the public key can only be
used for authentication (and not impersonation). there was a report in
the last year or so that something like 70 percent of account/identity
theft involves insiders.
business processes also have a requirement that their relationship management repository has integrity (from PAIN) .... aka that they can trust the information contained in the relationship management repository (in addition, there may be confidentiality requirements if the repository contains shared-secret authentication material, since that information could be used to impersonate).
there are additional business processes which have an original design point of offline email from the early 80s. this has several pieces: digital certificates, certification authorities, PKI, etc. In the early 80s, a recipient dialed their local (electronic) post office, exchanged email and hung up. they now were potentially faced with processing first time communication from a total stranger (and had no recourse to local or other information about the total stranger).
The total stranger registers their public key and other information in a relationship management infrastructure run by a certification authority. The certification authority then creates a specially formatted, digitally signed message called a digital certificate (containing the registered information). Now a stranger, with first time communication, creates a message, digitally signs it, and sends off the message, the digital signature and the digital certificate.
first off, a recipient has extended their trusted repository to include the authentication public keys of some number of these certification authorities. when the recipient receives a first time, digitally signed message from a total stranger ... with an attached digital certificate ... they then can use the CA's public key (from their trusted repository) to validate the (digital signature on the) digital certificate. then they can use the sender's public key from the digital certificate to validate the message's digital signature. they use the additional information contained in the digital certificate (copied from the certification authority's relationship management repository) in the processing of first time communication from a total stranger. this is an alternative to the recipient already having the stranger's public key directly registered in the recipient's relationship management infrastructure.
the digital certificate format is sometimes convenient for transporting CA public keys ... with the objective that they are sent to a relying party who then can load the associated public key into their trusted relationship repository. they typically aren't digital certificates in the business process definition since they are frequently self-signed and the receiving relying party must perform some other process before deciding to load the associated public key into their trusted relationship repository.
In the early 90s, you were finding some certification authorities looking at grossly overloading x.509 identity certificates with personal information (because they weren't able to predict who the recipients were going to be and/or what their requirements might be when dealing with total stranger). In such PKI/CA business process, these recipients are also referred to as relying parties (because they are relying on the information supplied by the certification authorities in the digital certificate).
In the mid-90s, you started to see some institutions realizing that x.509 identity certificates, grossly overloaded with personal information, represented significant privacy and liability issues. In this time-frame you saw some retrenchment to relying-party-only certificates. These relying-party-only certificates effectively contained some sort of database lookup index (like an account number) and a public key. However, it is trivial to show that such relying-party-only certificates are redundant and superfluous, i.e. by definition, the relying party already has access to all the necessary information in their relationship management infrastructure.
In some PKI/CA payment initiatives from the mid-90s, they also found out that such relying-party-only certificates represented severe payload bloat. The typical payment message size is on the order of 60-80 bytes while the relying-party-only certificate overhead ran 4k-12k bytes. Not only was it trivial to show that such relying-party-only certificates were redundant and superfluous, but they also represented a factor of 100 times increase in payload bloat (which is pretty significant for something not serving any useful purpose).
as addenda ... with the drastic cost reductions for online connectivity and the ubiquitous availability of the internet, some of the CA/PKI operations are looking at moving into the no-value market segment (since the offline market segment where the relying party had no other recourse to information about first time communication with a stranger, is rapidly disappearing). the no-value market segment is where the relying party can't justify the cost of an online transaction to determine information about first-time communication with a stranger. One issue is that the no-value market segment probably isn't looking at spending a great deal of money on digital certificates.
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: A good argument for XML Newsgroups: comp.databases.theory Date: 01 Jul 2005 10:47:52 -0600Gene Wirchenko writes:
by "G", "M", and "L". and gml support added to the existing cms script
processing command. later in the 70s, it was standardized in iso
as SGML
https://www.garlic.com/~lynn/submain.html#sgml
also in somewhat the same mid-70s time-frame ... the original
relational, sql implementation (system/r) was done on the
same platform at sjr
https://www.garlic.com/~lynn/submain.html#systemr
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: More Phishing scams, still no SSL being used... Newsgroups: netscape.public.mozilla.crypto Date: 01 Jul 2005 12:17:14 -0600"Vivek" writes:
OCSP sort of came on the scene in the mid-90s after I was pointing out that suggestions regarding converting the payment card network to "modern" PKI was actually a technology regression of 20 or more years.
the credit card industry was doing offline processes with plastic credentials and invalid account booklets mailed to all merchants every month, then weekly, then possibly looking at printing tens of millions of invalid account booklets and mailing them out every day.
so instead, they transitioned to online transactions by adding a magstripe to the existing plastic credential. now rather than relying on stale, static credential information, they could do real live, online transactions (and poof goes the problem of mailing out tens of millions of account invalidation booklets every couple hrs).
the observation is that OCSP goes to all the overhead and expense of having an online transaction ... but actually is returning very little useful information.
If you started suggesting that OCSP should start returning actual, useful information, then somebody might conclude that you could get rid of the certificates all together and just go to a real online transaction (instead of a pseudo offline infrastructure with most of the downside of being offline but also most of the overhead of having an online transaction).
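a schematic contrast (hypothetical response shapes, not the actual OCSP wire format or any particular network's message format) of the difference being described ... the status check only says "still valid", while a real online transaction can return the timely information a relying party actually wants:

ocsp_style_response = {
    "certificate_serial": "00af12",
    "status": "good",                 # about all the relying party learns
}

online_transaction_response = {
    "account": "12345678",
    "authorized": True,
    "open_to_buy": 1250.00,           # timely aggregated information (e.g. remaining credit)
    "recent_activity_flag": False,    # timely sequence-of-events style information
}

print(ocsp_style_response)
print(online_transaction_response)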
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: More Phishing scams, still no SSL being used... Newsgroups: netscape.public.mozilla.crypto Date: 01 Jul 2005 14:30:05 -0600Ram A Moskovitz writes:
i was getting hit with "wouldn't it be a modern marvel to convert the payment infrastructure to certificates" ... and then pointing out to them that conversion to an offline certificate-based operation would actually represent regressing the payment infrastructure by at least 20 years.
when we were doing this thing with this small client/server startup
that wanted to do payments
https://www.garlic.com/~lynn/aadsm5.htm#asrn2
https://www.garlic.com/~lynn/aadsm5.htm#asrn3
we had to do due diligence on the major operations that were going to be issuing these things called SSL domain name certificates in support of the operation. we were constantly pointing out that most of them were actually doing certificate manufacturing (a term we coined at the time) and hadn't actually bothered to implement a real PKI that actually administered and managed the infrastructure. furthermore, the payment infrastructure had learned at least 25 years earlier that revocation lists scaled extremely poorly.
the following is from the PKIX WG minutes apr 7-8 1997
Sharon Boeyen presented the work to date on Part 2 regarding the use of
LDAP and FTP for retrieval of certificates and CRLs and the requirements
for and specification of an Online Certificate Status Protocol (OCSP).
DISCUSSION
1 - Should we consider splitting the document into two separate ones,
since the OCSP is a new protocol definition which may require
significant more review and discussion than the LDAP and FTP profiles?
Resolution: Although we agree that OCSP may require additional review,
the document will remain a single draft and we will re-address this
issue, if the OCSP discussion is such that it will require a longer
review period and impede progression of the remainder of the document.
.....
I have the original email ... but it can also be found here
http://www.imc.org/ietf-pkix/old-archive-97/msg00316.html
as a side comment, the lead architect for ocsp was from valicert
http://www.rsasecurity.com/press_release.asp?doc_id=334&id=1034
in addition to ocsp ... about the same time there were some other
infrastructures looking at various gimmicks to improve the revocation
process. note in the following announcement ... they were almost
quoting me word-for-word about how archaic the CRL process actually
is.
Date: Wed, 29 Oct 1997 21:05:12 -0800
SUBJECT: VALICERT TACKLES FLAW IN E-COMMERCE SECURITY
A group of Silicon Valley entrepreneurs has set out to correct a flaw
in the digital certification process that many Internet experts have
been counting on to make Internet commerce secure.
The solution, called a certificate revocation tree, is the property of
Valicert Inc., a Sunnyvale, Calif., company formed last year and
officially opened for business this week.
In a sign that Valicert may be on to something that could bring added
security to Internet transactions, three vendors in the data
encryption field have given endorsements, and Netscape Communications
Corp. has made a provision for Valicert's technology to "plug in" to
the SuiteSpot server software.
The advent of Valicert indicates that digital certification-a
cryptographic technique that is believed to be on the road to broad
public acceptance through Internet security protocols such as the
credit card industry's SET-needs further refinement. "Today there is
no way to know if a certificate is valid at the time of a
transaction-it is known only that the certificate was valid at the
time of issuance," said Joseph "Yosi" Amram, president and chief
executive officer of Valicert.
He said that if not for the Valicert method of keeping revoked
certificates from being approved-it will be available in the form of a
tool kit for system developers, a server system, and a service from
Valicert-electronic commerce could collapse under the weight of
millions of digital certificates that cannot be adequately validated.
SET, the Secure Electronic Transactions protocol adopted by MasterCard
and Visa for on-line credit card transactions, illustrates the problem
in the extreme. SET requires issuance of digital certificates to all
parties to a transaction. They are the E-commerce equivalent of a
driver's license to verify a cardholder's identity or a certification
that an on-line merchant is what it claims to be. The complexity of
processing transactions with those multiple certificates is widely
seen as slowing the adoption of SET. But digital certificates have
already been issued by the millions through Netscape and Microsoft
Corp.'s Internet browsers. Verisign Inc. and GTE Corp. are prominent
certificate vendors. GTE, Entegrity Solutions, and Entrust
Technologies, the leader in public key infrastructure systems, have
each agreed to some form of collaboration with Valicert.
Valicert's efforts can "expand the security infrastructure available
for commerce," said Tom Carty, vice president of marketing and
business development at GTE. "Given our focus on providing all of the
pieces of the infrastructure required to make Internet commerce
possible, it makes great sense for us to partner with Valicert to fill
in one of the most essential pieces of the infrastructure puzzle-the
digital credential checkpoint."
In a recent interview, Mr. Amram and Valicert chairman Chini Krishnan
said the problem is akin to what the credit card industry faced before
electronic authorization systems.
"A merchant would get a book, which came once a week or once a month,
full of bad credit card numbers, and credit cards presented at the
point of sale would have to be looked up manually," said Mr. Amram,
who joined Valicert in August after being involved in other high-tech
start-ups and in the Silicon Valley venture capital scene. "It was a
big hassle and it slowed down checkout."
The digital certificate equivalent of the hot-card list is known as
the certificate revocation list, or CRL.
Mr. Krishnan, the Valicert founder, said CRLs are "unscalable,"
meaning they become cumbersome, if not impossible, to manage as they
approach mass-market proportions. The lack of scalability "has posed a
barrier to widespread deployment," Mr. Krishnan said. He claimed that
the invention of the certificate revocation tree brings a "1,000-to-1
advantage" that solves the problem of revocation and validation in a
tamper-proof and economical way.
"Developers need a cost-effective, one-step solution for building
applications that can check the validity of digital certificates," Mr.
Amram said. "By providing a clearing house network into multiple
certification authorities, and by delivering a robust technology
combined with a liberal licensing policy, Valicert will enable the
widespread development and use of applications that will make the
Internet and corporate intranets safe to conduct business."
"Certificates are the only way to deal with identity in any meaningful
way," Mr. Amram said. "They will take off in a big way. But
certificates without validation are like a car without brakes."
Mr. Krishnan said the development of Valicert's technology had "a lot
of rocket science elements," which is why it took the company 20
months to reach the launch stage. Enhancing its credentials, Paul
Kocher, a leading cryptography researcher, is credited with inventing
the underlying technology. Martin Hellman, a Stanford University
professor and half of the Diffie-Hellman team that invented public key
cryptography, is on Valicert's scientific advisery board.
Commercializers of cryptographic security have been intrigued by
Valicert's proposition. When he heard about it during American
Banker's Online '97 conference in Phoenix, Scott Dueweke, a marketing
manager in International Business Machines Corp.'s Internet division,
said, "They should call us."
Another expert, who asked not to be identified, said Valicert's
biggest problem is that it is a few years ahead of its time. "The
market has fallen down with respect to revocation management, relying
on relatively short expiration dates" to minimize invalid
certificates, said Victor Wheatman, a California-based analyst with
Gartner Group, Stamford, Conn. "Valicert fills a void and hopes to
develop technology before the leading players move forward with their
own revocation capabilities."
Valicert's server and tool kit are available now, and its service to
certificate acceptors will enter field trials later this year, the
company said. The tool kit can be downloaded from the valicert.com Web
site free for noncommercial use and evaluation purposes. Application
development licenses are a flat $995 with unlimited sublicense rights.
The server can be deployed on corporate intranets for $9,995.
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: More Phishing scams, still no SSL being used... Newsgroups: netscape.public.mozilla.crypto Date: 01 Jul 2005 14:53:49 -0600Anne & Lynn Wheeler writes:
containing little more than some type of database lookup value (like account number) and the public key (as a way of dealing with the significant privacy and liability issues that go along with x.509 identity certificates containing enormous amounts of personal information).
part of the issue is that most business processes have well-established and long entrenched relationship management infrastructures ... that contains detailed and real-time information about the parties that they are dealing with. in such environments it was trivial to show that the relying-party-only certificates (indexing an online relationship management infrastructure containing the real information) were redundant and superfluous.
in fact, stale, static digital certificates of nearly any kind become redundant and superfluous when the business process has to deal with an established online, real-time relationship management infrastructure.
the target for digital certificates, PKIs, etc ... was offline relying parties involved in first-time communication with total strangers, where they had no recourse to information about the party they were dealing with (sort of the letters-of-credit model from the sailing ship days).
as the internet becomes more ubiquitous, the offline market segment is rapidly disappearing. there has been some shift by PKI operations into the no-value market segment ... where the relying party can't justify the cost of an online transaction when first time interactions with strangers are involved. However, as the internet becomes more and more ubiquitous, the cost of using the internet for online operations is also rapidly dropping ... creating an enormous squeeze on even the no-value market segments.
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: More Phishing scams, still no SSL being used... Newsgroups: netscape.public.mozilla.crypto Date: 01 Jul 2005 15:43:00 -0600oh, what the heck ... a little more fun with a walk down memory lane (Anne & I even get honorable mention)
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: More Phishing scams, still no SSL being used... Newsgroups: netscape.public.mozilla.crypto Date: 02 Jul 2005 12:22:57 -0600Anne & Lynn Wheeler writes:
is that their "light" digital certificates are also referred to has
relying-party-only certificates
https://www.garlic.com/~lynn/subpubkey.html#rpo
... since rather than carrying the actual information, they just carry some sort of index pointer into a business relationship management infrastructure (like account number). the relationship management infrastructure contains all the real information.
however, it is trivial to show that such "light" certificates are redundant and superfluous when the business process has to access the infrastructure containing the real information. this is also somewhat a trivial logic operation when you take the original design point for digital certificates is providing relying parties with information in an offline environment when the relying parties had no other recourse to the real information; aka therefor by definition, if the relying parties have access to the real information ... the original purpose and justification for digital certificates is invalidated.
the other issue is that even for "light" certificates the infrastructure overhead for appending certificates ran 4k to 12k bytes. when you are talking about the basic payment card infrastructure where typical message size is 60-80 bytes, the appended certificate paradigm represents an enormous payload bloat of one hundred times (two orders of magnitude) ... for otherwise redundant and superfluous certificates.
the basic technology is asymmetric key cryptography, where what one key (of a key-pair) encodes, the other of the key-pair decodes.
a business process has been defined called "public key" ... where one of
the key-pair is made freely available and the other key is identified
as private and kept confidential and never divulged.
a further business process has been defined called "digital signature"
... which represents something you have authentication ... from the
3-factor authentication paradigm
https://www.garlic.com/~lynn/subintegrity.html#3factor
• something you have
• something you know
• something you are
a digital signature implies that some entity has access to and (presumably
sole) use of a specific private key.
existing relationship management infrastructures can upgrade their
shared-secret based authentication
https://www.garlic.com/~lynn/subintegrity.html#secret
to digital signature, by registering a public key in lieu of pin, password, ssn, date-of-birth, mother's maiden name, etc. while in shared-secret infrastructures the same value is used both to originate as well as to authenticate, in the public key scenario for digital signatures the public key is only used to authenticate (and can't be used to originate or impersonate).
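a hypothetical sketch of that upgrade (the field names are mine, and it reuses the tiny textbook-RSA numbers from the earlier sketches): the relationship record stores a registered public key instead of a pin/password, and authentication is done by verifying a digital signature ... anyone reading the record can verify but cannot originate:

import hashlib

def h(data: bytes, n: int) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

# account record in the relationship management infrastructure: no shared secret
# on file, only the registered public key (n, e)
account = {"number": "12345678", "balance": 100.00, "pub_n": 3233, "pub_e": 17}

def authenticate(acct: dict, message: bytes, signature: int) -> bool:
    # insiders reading this record learn only the public key, which lets them
    # verify transactions but not originate them (the private key is never on file)
    return pow(signature, acct["pub_e"], acct["pub_n"]) == h(message, acct["pub_n"])

# consumer side: sign with the private key (d=2753 pairs with n=3233, e=17)
txn = b"transfer 25.00 to account 87654321"
sig = pow(h(txn, 3233), 2753, 3233)
print(authenticate(account, txn, sig))    # True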
from the PAIN security acronym
P ... privacy (or sometimes CAIN, confidential)
A ... authentication
I ... integrity
N ... non-repudiation
it can be easily demonstrated that relationship management
infrastructures tend to have very high integrity requirements
(regarding all details of the relationship as well as the
authentication information).
however, many business infrastructures that make heavy use of their relationship management infrastructure for numerous business processes are also at risk of exposing the authentication information. when this authentication information is a public key, it tends to not be a big deal. however, when the authentication material consists of secrets, then there is an enormous privacy requirement (since obtaining the secrets also enables impersonation, fraudulent transactions, account fraud, etc).
Using secret-based authentication can create diametrically opposing objectives for the relationship management infrastructure ... on one hand the relationship management infrastructure has to be readily available in support of numerous business operations .... and on the other hand, the secret-based privacy requirements are that none but extremely constrained business operations can access the information.
one such description is my old security proportional to risk posting
https://www.garlic.com/~lynn/2001h.html#63
and a whole host of postings on skimming and harvesting of
(secret-based) authentication material (that can be leveraged
to perform fraudulent transactions)
https://www.garlic.com/~lynn/subintegrity.html#harvest
as an aside ... there was a report within the past couple years that something like 70 percent of identity/account fraud involved insiders. there have been some additional, similar reports. there were a number of these news URLs in the past couple days:
Bank workers biggest ID theft threat; Insiders with access to data may
pose 70% to 80% of risk
http://deseretnews.com/dn/view/0,1249,600145529,00.html
Banks Face Challenge In Screening Employees To Avert Inside ID Thefts
http://www.banktech.com/aml/showArticle.jhtml?articleID=164904297
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: More Phishing scams, still no SSL being used... Newsgroups: netscape.public.mozilla.crypto Date: 02 Jul 2005 13:29:32 -0600Ram A Moskovitz writes:
however, it is trivial to show that if the relying-party is going to access some form of relationship management infrastructure containing all the real information, then any stale, static digital certificates are redundant and superfluous.
the issue of PKIs moving into no-value transactions .... is that a relationship management infrastructure typically contains a lot more timely and higher quality information for making decisions. If the infrastructure can justify the value of having higher quality online information ... then the PKIs and stale, static digital certificates are redundant and superfluous. That leaves PKIs looking for the rapidly shrinking markets where 1) the relying party can't access the real information directly (restricted to offline environment) and/or 2) the relying party can't justify the need to have direct and timely, higher quality information.
in the mid-90s, FSTC
http://www.fstc.org/
was in a quandary over FAST ... basically doing simple digitally
signed transactions but expanding them to cover more than financial
transactions. there are some implied references to that opportunity
in the old news posting
https://www.garlic.com/~lynn/2005l.html#34 More Phishing scams, still no SSL being used
in the x9a10 financial standard working group, we were charged with
preserving the integrity of the financial infrastructure for all
retail payments. the x9.59 standard was the result
https://www.garlic.com/~lynn/x959.html#x959
it basically is the minimum payload increase to existing payment
messages. it can be mapped to iso 8583 debit, credit, and stored-value
messages
https://www.garlic.com/~lynn/8583flow.htm
with the addition of a couple additional minimal fields and a digital signature (and no enormous payload bloat by appending stale, static, redundant and superfluous digital certificates).
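a purely hypothetical sketch of the shape of such a message (the field names are mine, not taken from x9.59 or iso 8583): the existing short payment message gains a couple of small fields plus a digital signature, and no multi-kilobyte certificate is appended ... the consumer's financial institution verifies against the public key already registered for the account:

payment_message = {
    "pan": "4111111111111111",    # account number, already in today's message
    "amount": "42.50",
    "merchant": "000123",
    "txn_counter": 7,             # illustrative "additional minimal field"
}
serialized = "|".join(f"{k}={v}" for k, v in sorted(payment_message.items()))
payment_message["signature"] = "<digital signature computed over the serialized fields>"

# the authenticated message stays on the order of the original 60-80 byte payload
# plus a signature, versus 4k-12k bytes if a certificate were appended
print(len(serialized), "bytes of message fields before the signature")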
The FAST scenario was basically to enable the asking of yes/no questions about things other than financial transactions (i.e. the title of the referenced news article: Privacy Broker: Likely Internet Role for Banks?).
For instance a merchant could ask if the person was of legal drinking age. There was no requirement to divulge the actual birthdate (birthdates are widely used as a means of authentication, so divulging birthdates represents an identity fraud threat).
The FSTC/FAST scenario was that there is a large and thriving internet
business for age verification ... but it involved a segment of the
internet business that many consider unsavory. However, the widespread
deployed implementation was based on an intermediary doing a "$1 auth"
credit card transaction as part of registration. The "$1 auth" would
never be settled, so there was never any actual credit card charge
(although your credit limit or open to buy would be decremented by a
dollar for a couple days until the auth had expired). The theory was
that a person had to be of legal age to sign a credit card contract,
which in turn enabled them to do credit card transactions. There was a
lot of money being made off of this "$1 auth" hack ... and only a very
small amount going to the financial industry. Note however, FAST was
never intended to only be limited to age verification ... but age
verification was viewed as an already well-established market.
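a minimal sketch of the FAST-style yes/no question (data and names invented purely for illustration; the real thing would be a digitally signed transaction riding the existing payment rails rather than a local function call):

  from datetime import date

  # held only by the authoritative agency (e.g. the consumer's bank)
  account_records = {"acct-1234": {"birthdate": date(1980, 3, 14)}}

  def is_of_legal_drinking_age(account, on_date, legal_age=21):
      # answer yes/no without ever divulging the birthdate itself
      birth = account_records[account]["birthdate"]
      age = on_date.year - birth.year - (
          (on_date.month, on_date.day) < (birth.month, birth.day))
      return age >= legal_age

  # the merchant (relying party) only ever sees the yes/no answer
  print(is_of_legal_drinking_age("acct-1234", date(2005, 7, 2)))   # True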
When we were called in to work on the cal. state and federal electronic signature legislation, one of the industry groups had done studies on the driving factors behind privacy regulation and legislation. The two main driving factors were 1) identity theft and 2) (institutional) denial of service; aka the primary driving factors weren't privacy itself ... it was the prospect of fraud and/or being denied a job or various kinds of services.
so as implied in the previous post
https://www.garlic.com/~lynn/2005l.html#35
and the post on security proportional to risk
https://www.garlic.com/~lynn/2001h.html#63
many relationship management infrastructures have strongly conflicting business confidentiality objectives ... the information has to be readily available for use by lots of business processes, and at the same time hardly available at all, because it includes authentication information that can also be used to impersonate and originate fraudulent transactions.
harvesting of such repositories is frequently made easier because of the large number of different business processes that require access to the information (in some cases the transaction information even defines the authentication information).
going back to the security PAIN acronym
P ... privacy (or sometimes CAIN, with C for confidentiality)
A ... authentication
I ... integrity
N ... non-repudiation
the businesses tend to have a strong integrity requirement for their
relationship management systems (various kinds of integrity issues
like introduction of incorrect values can affect their bottom line).
However, they tend to have a much lower privacy requirement for
their relationship management systems (in part because a large number
of different business processes require access).
When an insider swipes the information, they tend to go far away to do their account/identity fraud.
I'm also a co-author of the x9.99 financial industry privacy impact
assessment (PIA) standard. Most companies understand using security
(and frequently integrity) to protect themselves. However, it
frequently takes a change in mindset to start using security (and
frequently privacy) in the protection of others. minor note ... as
part of x9.99, i also started a privacy taxonomy and glossary (trying
to help organize how you think about privacy):
https://www.garlic.com/~lynn/index.html#glosnote
One of the issues with the posting on security proportional to risk ... is that even if you blanketed the earth under miles of cryptography, the current infrastructure still can leak information that can be used in account and identity fraud.
One of the things in the x9.59 standard
https://www.garlic.com/~lynn/x959.html#x959
was that it removed knowledge of an account number as point of compromise. Given that the account number is used in an enormous number of business processes ... trying to keep it confidential appears to be an impossible task. so x9.59 changed the rules, it made the account number useless to crooks for performing fraudulent transactions:
1) x9.59 transactions had to be strongly authenticated
2) account numbers used in x9.59 transactions could not be
used in non-authenticated transactions
aka gave up on trying to keep the account number confidential ... just made knowledge of the account number useless to crooks for account/identity fraud.
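a sketch of the two rules in python (data structures invented for illustration and the signature check reduced to a boolean):

  # accounts flagged as x9.59 "authenticated-transactions-only"
  x959_accounts = {"4111111111111111"}

  def authorize(account, amount, has_valid_signature):
      if account in x959_accounts and not has_valid_signature:
          # rule 2: an x9.59 account number can't be used in a
          # non-authenticated transaction ... so a harvested or
          # skimmed account number is useless to a crook
          return "declined"
      if account in x959_accounts and has_valid_signature:
          # rule 1: x9.59 transactions are strongly authenticated
          return "approved"
      return "legacy processing"

  print(authorize("4111111111111111", 4995, has_valid_signature=False))  # declined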
misc. related postings
https://www.garlic.com/~lynn/aadsm6.htm#terror7 [FYI] Did Encryption Empower These Terrorists?
https://www.garlic.com/~lynn/aadsm6.htm#terror13 [FYI] Did Encryption Empower These Terrorists?
https://www.garlic.com/~lynn/aadsm8.htm#3dvulner 3D Secure Vulnerabilities?
https://www.garlic.com/~lynn/aadsm8.htm#softpki16 DNSSEC (RE: Software for PKI)
https://www.garlic.com/~lynn/aepay11.htm#66 Confusing Authentication and Identiification?
https://www.garlic.com/~lynn/aadsm14.htm#4 Who's afraid of Mallory Wolf?
https://www.garlic.com/~lynn/aadsm15.htm#27 SSL, client certs, and MITM (was WYTM?)
https://www.garlic.com/~lynn/aadsm16.htm#20 Ousourced Trust (was Re: Difference between TCPA-Hardware and a smart card and something else before
https://www.garlic.com/~lynn/aadsm17.htm#41 Yahoo releases internet standard draft for using DNS as public key server
https://www.garlic.com/~lynn/aadsm18.htm#29 EMV cards as identity cards
https://www.garlic.com/~lynn/aadsm19.htm#39 massive data theft at MasterCard processor
https://www.garlic.com/~lynn/aadsm19.htm#40 massive data theft at MasterCard processor
https://www.garlic.com/~lynn/2000g.html#41 Egghead cracked, MS IIS again
https://www.garlic.com/~lynn/2001f.html#24 Question about credit card number
https://www.garlic.com/~lynn/2002j.html#14 Symmetric-Key Credit Card Protocol on Web Site
https://www.garlic.com/~lynn/2002n.html#14 So how does it work... (public/private key)
https://www.garlic.com/~lynn/2003k.html#66 Digital signature and Digital Certificate
https://www.garlic.com/~lynn/2004b.html#25 Who is the most likely to use PK?
https://www.garlic.com/~lynn/2004i.html#5 New Method for Authenticated Public Key Exchange without Digital Certificates
https://www.garlic.com/~lynn/2004m.html#9 REVIEW: "Biometrics for Network Security", Paul Reid
https://www.garlic.com/~lynn/2005k.html#26 More on garbage
https://www.garlic.com/~lynn/2005l.html#22 The Worth of Verisign's Brand
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: More Phishing scams, still no SSL being used... Newsgroups: netscape.public.mozilla.crypto Date: 02 Jul 2005 14:54:36 -0600Ram A Moskovitz writes:
mentioned in the previous post
https://www.garlic.com/~lynn/2005l.html#36 More Phishing scams, still no SSL being used
instead of the end-user sending the merchant a digitally signed
x9.59 transaction mapped into standard iso 8583 message network
https://www.garlic.com/~lynn/8583flow.htm
which the relying party then sends off and gets an answer back from the authoritative agency (in the case of a financial transaction, whether the merchant will be paid or not) ... a very similarly formatted transaction of the same size and shape is sent off to ask any of possibly dozens of questions.
furthermore, there is NO attached redundant and superfluous digital certificate that results in two orders of magnitude payload bloat.
another way of looking at it ... is rather than having a large PKI infrastructure targeted at efficiently providing information in a no-value and/or offline environment ... and then layering the overhead of an online transaction infrastructure over it .... there is just the overhead of the online transaction infrastructure.
So the FAST scenario has at least all the transaction efficiencies of OCSP ... w/o any of the heavy duty, extraneous, redundant and superfluous burden of PKIs and digital certificates.
The other way of looking at it ... was that OCSP was trying to emulate the online transaction efficiencies of FAST, but trying to maintain the facade that the stale, static, redundant and superfluous PKI digital certificates were in any way useful (for such an online environment).
To meet that requirement (maintaining the fiction that digital certificates were useful in such environments), OCSP was limiting itself to a transaction about whether the information in a specific stale, static, redundant and superfluous digital certificate was still valid. The FAST scenario just did a highly efficient, straight-through processing, digitally signed transaction to get a reply about the actual information (somewhat riding the existing rails that provide highly efficient straight-through processing for performing payment transactions). If OCSP started expanding its horizon and asking real live questions (aka turned into something more akin to FAST) ... then it would become more readily apparent that the stale, static, redundant and superfluous digital certificates weren't serving any useful purpose.
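a caricature of the difference (both involve an online round trip; message formats, names and data below are invented for illustration):

  def ocsp_style_query(cert_serial):
      # only answers "is the stale, static certificate still valid?"
      revoked = set()                       # CA's revocation data
      return "good" if cert_serial not in revoked else "revoked"

  def fast_style_query(account, question):
      # asks the authoritative agency the actual question directly
      answers = {("acct-1234", "over-21?"): "yes"}   # agency's timely data
      return answers.get((account, question), "unknown")

  print(ocsp_style_query("serial-42"))              # relying party still has no real information
  print(fast_style_query("acct-1234", "over-21?"))  # relying party gets the real answer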
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: IBM/Watson autobiography--thoughts on? Newsgroups: alt.folklore.computers Date: 03 Jul 2005 20:45:08 -0600fairwater@gmail.com (Derek Lyons) writes:
travelling combines started the season early in texas and moved north as the wheat ripened. (at least back then) they would harvest wheat for 2-3 bushels per acre, haul it into the farmer's co-op. the farmer's co-op carried farmers on credit all year ... and if there was enuf wheat left over after the combine crew got its cut ... they could settle the bills. this is mid-west dry land wheat, a really good year might yield 6 bushels/acre (before the combine crew's cut) ... not the famous stuff from eastern washington where they've posted records of 90-100 bushels/acre.
doing a quick web search ... there is a short mention of the traveling
combines that started in texas and moved north to montana ... see
bottom of this page ... harvest brigade
http://www.hutchnews.com/past/06-27-2004/adastra/adastra2.html
slightly more detailed description of the harvest route
http://www.skinnerharvesting.com/
http://www.ckfigginsharvesting.com/
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: Safe to transmit (symmetric) key encrypted with itself? Newsgroups: sci.crypt Date: 04 Jul 2005 08:38:51 -0600or maybe it is still somewhat at this stage:
furthermore i believe there have been one or two discovery programs about disciplines mandating what the correct orthodoxy is (which later turned out to be wrong). it is possible you are comparing two disciplines that are at completely different maturity levels.
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: IBM/Watson autobiography--thoughts on? Newsgroups: alt.folklore.computers Date: 04 Jul 2005 09:16:59 -0600jmfbahciv writes:
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: 25% Pageds utilization on 3390-09? Newsgroups: bit.listserv.ibm-main Date: 05 Jul 2005 09:12:58 -0600martin_packer@ibm-main.lst (Martin Packer) writes:
the difference between paging and a log structured file system ... is that the page data tended to be ephemeral ... i.e. the data on disk was discarded when a big page was fetched back to memory (on page fault). a log structured file system involved persistent data and would periodically compact scattered regions on disk.
the issue was that over a 10-15 year period, relative system disk access performance had declined by a factor of ten (that is, memory and cpu got 50 times bigger & faster, while disk access thruput only got 3-5 times faster). The issue in the 3081/3380 time-frame was that while 3380 disk access performance had only increased by a factor of maybe four ... 3380 disk transfer speed had increased by a factor of ten (and real memory had increased by a factor of maybe 50). The result was that there was a relative over-abundance of disk transfer capacity and real memory compared to disk access thruput (and total disk space capacity had also enormously increased).
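a back-of-the-envelope version of the arithmetic (using the rough growth factors mentioned above, not measured numbers):

  cpu_memory_growth = 50.0   # cpu & real memory: ~50 times bigger/faster
  disk_access_growth = 4.0   # arm accesses/sec: only ~3-5 times faster

  relative_disk_thruput = disk_access_growth / cpu_memory_growth
  print(relative_disk_thruput)   # 0.08 ... i.e. disk arm access declined
                                 # by a factor of ten or more relative to
                                 # the rest of the system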
The issue was how to trade-off the enormous amount of disk space capacity, and the relatively large amounts of disk transfer capacity and real storage against the scarce bottleneck disk arm access resource.
big-pages with moving cursor ... did ten 4k page writes for every arm access and attempted to drastically minimize the expected arm travel distance (compared to single page at a time transfer). with very sparse allocation (trading disk space resources against disk arm access scarcity), multiple big page writes might be performed on the same cylinder w/o arm motion.
on any 4k page-fault ... all ten 4k pages of a big page were brought back into memory. compared to a 4k page at a time page fault strategy, it might bring in 2-3 more pages than the application would eventually need. however, it would likely avoid at least 5-6 page transfers compared to a 4k page at a time strategy (since the pages had already been brought in). The trade-off was that on fetch, a big page might unnecessarily transfer 2-3 pages (disk transfer resource) and occupy 2-3 pages of additional real memory (real memory resource) at the savings of 5-6 arm accesses. hopefully the number of such additional page transfers was minimized ... but even if it wasn't, reducing ten arm accesses to one more than offset any 8-12k increase in the amount of data transferred and/or 8-12k increase in real storage needed.
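a rough model of that trade-off (the "application eventually touches 7 of the 10 pages" case is just an illustrative number consistent with the description above):

  PAGES_PER_BIG_PAGE = 10

  def single_page_strategy(pages_needed):
      # one arm access and one 4k transfer per page fault
      return {"arm_accesses": pages_needed, "pages_transferred": pages_needed}

  def big_page_strategy(pages_needed):
      # one arm access fetches the entire 10-page big page
      return {"arm_accesses": 1, "pages_transferred": PAGES_PER_BIG_PAGE}

  print(single_page_strategy(7))   # 7 arm accesses, 7 pages moved
  print(big_page_strategy(7))      # 1 arm access, 10 pages moved
  # i.e. 3 "wasted" page transfers/frames traded for saving 6 accesses
  # on the scarce disk arm resource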
another trade-off was that most single-page algorithms tended to preserve home position when a page was read into memory. while the page was in memory ... a copy existed on both disk and in real storage. when the page was selected for replacement, if the page hadn't been changed during its most recent stay ... the page write could be avoided (just keeping the existing copy on disk). in the big page scenario ... the notion of an existing copy on disk was discarded (in part because its arm position might not bear any relationship to the arm position when the page was to be removed from storage). As a result, the number of bytes written out associated with page replacements somewhat increased (compared to single page at a time with home position), at a savings in the number of disk arm accesses.
at the introduction of the 3380 there was some discussion about the enormous increase in 3380 disk space capacity. if you completely filled a 3380, at 4k bytes transferred per access ... the accesses per second per byte were lower than for a 3350; i.e. 3380 arm accesses per second were higher than 3350 arm accesses per second ... but the increase in 3380 disk space capacity was even larger.
There was a discussion at SCIDS regarding recommendations to datacenter management that 3380s only be filled to less than 80 percent of capacity to maintain thruput equivalence (between a full 3350 and a full 3380). The problem was that datacenter management tended to account for disk space cylinders but not necessarily overall system thruput (based on the bottleneck of available disk accesses per second). The recommendation was that a "fast" 3380 should be announced ... one that had a controller microcode load that reduced the number of available 3380 cylinders ... and that the price of the "fast" 3380 should be higher than the price of a regular 3380 (with no reduction in cylinders). The SCIDS discussion was that this was possibly the only way to convince most datacenter management of the benefits of managing the disk-access-per-second-per-byte throughput bottleneck (make them pay more for a hardware-enforced feature that they could otherwise achieve thru simple administrative policy).
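a sketch of the "how full can the 3380 be" arithmetic; the capacity and access-rate ratios below are made-up round numbers chosen so the answer lands near the 80 percent figure, NOT actual device specifications:

  # normalize everything to the 3350
  capacity_3350, accesses_per_sec_3350 = 1.0, 1.0
  capacity_3380, accesses_per_sec_3380 = 2.0, 1.6   # much more space, only
                                                    # somewhat more accesses/sec

  per_byte_3350 = accesses_per_sec_3350 / capacity_3350   # 1.0
  per_byte_3380 = accesses_per_sec_3380 / capacity_3380   # 0.8

  # fraction of a 3380 that can be allocated while keeping
  # accesses-per-second-per-allocated-byte no worse than a full 3350
  max_fill = per_byte_3380 / per_byte_3350
  print(max_fill)   # 0.8 ... fill to about 80 percent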
when I first started making statements and writing about the drastic decline in relative system thruput of disk arm accesses, GPD management assigned their performance modeling group to refute the claims. after some period, they came back with the conclusion that I had slightly understated the reduction in disk relative system thruput. this was then turned around and made into a SHARE presentation on how to optimize for the (unique?) 3380 performance characteristics.
misc. past posts about observing that relative system disk
access thruput had declined by at least ten times over a period
of years.
https://www.garlic.com/~lynn/93.html#31 Big I/O or Kicking the Mainframe out the Door
https://www.garlic.com/~lynn/94.html#43 Bloat, elegance, simplicity and other irrelevant concepts
https://www.garlic.com/~lynn/94.html#55 How Do the Old Mainframes Compare to Today's Micros?
https://www.garlic.com/~lynn/95.html#10 Virtual Memory (A return to the past?)
https://www.garlic.com/~lynn/98.html#46 The god old days(???)
https://www.garlic.com/~lynn/99.html#4 IBM S/360
https://www.garlic.com/~lynn/99.html#112 OS/360 names and error codes (was: Humorous and/or Interesting Opcodes)
https://www.garlic.com/~lynn/2001d.html#66 Pentium 4 Prefetch engine?
https://www.garlic.com/~lynn/2001f.html#62 any 70's era supercomputers that ran as slow as today's supercomputers?
https://www.garlic.com/~lynn/2001f.html#68 Q: Merced a flop or not?
https://www.garlic.com/~lynn/2001l.html#40 MVS History (all parts)
https://www.garlic.com/~lynn/2001l.html#61 MVS History (all parts)
https://www.garlic.com/~lynn/2001m.html#23 Smallest Storage Capacity Hard Disk?
https://www.garlic.com/~lynn/2002b.html#11 Microcode? (& index searching)
https://www.garlic.com/~lynn/2002b.html#20 index searching
https://www.garlic.com/~lynn/2002e.html#8 What are some impressive page rates?
https://www.garlic.com/~lynn/2002e.html#9 What are some impressive page rates?
https://www.garlic.com/~lynn/2002.html#5 index searching
https://www.garlic.com/~lynn/2002i.html#16 AS/400 and MVS - clarification please
https://www.garlic.com/~lynn/2003i.html#33 Fix the shuttle or fly it unmanned
https://www.garlic.com/~lynn/2004n.html#22 Shipwrecks
https://www.garlic.com/~lynn/2004p.html#39 100% CPU is not always bad
https://www.garlic.com/~lynn/2005h.html#13 Today's mainframe--anything to new?
https://www.garlic.com/~lynn/2005k.html#53 Performance and Capacity Planning
misc. past big page posts
https://www.garlic.com/~lynn/2001k.html#60 Defrag in linux? - Newbie question
https://www.garlic.com/~lynn/2002b.html#20 index searching
https://www.garlic.com/~lynn/2002c.html#29 Page size (was: VAX, M68K complex instructions)
https://www.garlic.com/~lynn/2002c.html#48 Swapper was Re: History of Login Names
https://www.garlic.com/~lynn/2002e.html#8 What are some impressive page rates?
https://www.garlic.com/~lynn/2002e.html#11 What are some impressive page rates?
https://www.garlic.com/~lynn/2002f.html#20 Blade architectures
https://www.garlic.com/~lynn/2002l.html#36 Do any architectures use instruction count instead of timer
https://www.garlic.com/~lynn/2002m.html#4 Handling variable page sizes?
https://www.garlic.com/~lynn/2003b.html#69 Disk drives as commodities. Was Re: Yamhill
https://www.garlic.com/~lynn/2003d.html#21 PDP10 and RISC
https://www.garlic.com/~lynn/2003f.html#5 Alpha performance, why?
https://www.garlic.com/~lynn/2003f.html#9 Alpha performance, why?
https://www.garlic.com/~lynn/2003f.html#16 Alpha performance, why?
https://www.garlic.com/~lynn/2003f.html#48 Alpha performance, why?
https://www.garlic.com/~lynn/2003g.html#12 Page Table - per OS/Process
https://www.garlic.com/~lynn/2003o.html#61 1teraflops cell processor possible?
https://www.garlic.com/~lynn/2003o.html#62 1teraflops cell processor possible?
https://www.garlic.com/~lynn/2004e.html#16 Paging query - progress
https://www.garlic.com/~lynn/2004.html#13 Holee shit! 30 years ago!
https://www.garlic.com/~lynn/2004n.html#22 Shipwrecks
https://www.garlic.com/~lynn/2004p.html#39 100% CPU is not always bad
https://www.garlic.com/~lynn/2005h.html#15 Exceptions at basic block boundaries
https://www.garlic.com/~lynn/2005j.html#51 Q ALLOC PAGE vs. CP Q ALLOC vs ESAMAP
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/
From: Anne & Lynn Wheeler <lynn@garlic.com> Subject: Re: More Phishing scams, still no SSL being used... Newsgroups: netscape.public.mozilla.crypto Date: 05 Jul 2005 10:26:11 -0600pgut001@cs.auckland.ac.nz (Peter Gutmann) writes:
from the mid-90s .... but it makes having the digital certificate redundant and superfluous (aka you don't need to have a digital certificate to do a real live transaction) ... which i have repeatedly commented on to the PKIX and OCSP factions (and may have been part of the reason for their violent reaction to your suggestion).
in '98 i was on a "PKI" panel at the nissc conference with four other people ... three representing the major CAs and one other person.
the people representing the three major CAs (typically the CTO) talked about how hard everybody has heard PKIs were ... and that they were here to tell you that it is much, much simpler than you have heard.
I then talked about how the majority of the business processes in the world can be upgraded to digital signature authentication w/o requiring digital certificates.
the fifth person then talked about being responsible for the largest and longest deployed PKI operation ... and people may have heard about how hard PKIs were, and they were here to tell you that PKIs are actually much, much harder than anything you have heard.
--
Anne & Lynn Wheeler | https://www.garlic.com/~lynn/