Reduce Password Proliferation.. OAuth 2.0 is your friend.. - Part 2
- By Santosh Subramanian • 17 Jul, 2018
This is a continuation of Part 1 of my post on the same topic. I would suggest reading the previous part before embarking on this one.
I had a good read of your previous post.. Now, take me through how the OAuth 2 route differs from the 'Password Anti-Pattern' route..
Alright... Taking the example of the Mail Aggregation website (KoolAgg) we discussed earlier,
- The Resource Owner (the end-user) delegates the authority to the Client (eg: KoolAgg) to make API calls to the Resource server (GMail Mail Server).
- KoolAgg (assuming it's registered with Google as mentioned in the pre-requisites) sends an authentication request passing its ClientID, the Scope required and a RedirectURL. The user is presented with a page from the Google server requesting grant of access (User Consent) and, in the event of a grant from the user, the browser is redirected back to the RedirectURL with the “Authorisation Code”.
- The client then sends a fresh request with the “Authorisation Code”, along with its ClientID and RedirectURL, to the Google Auth server to exchange it for an “Access Token”.
- The Google Auth server, on successful validation of this request, responds with an “Access Token” and optionally a “Refresh Token”.
- The client then calls the Google mail API in the resource server armed with the “Access Token” to retrieve the emails from Google which it can aggregate after performing a similar operation on the Yahoo Server too.
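The first two steps above can be sketched in code. Everything here is a hypothetical placeholder — the client ID, endpoint URLs and scope name are made up for illustration; a real provider such as Google publishes its own values:

```python
# Sketch of the Authorisation Code flow described above.
# All IDs, URLs and scopes below are placeholders, not real provider values.
from urllib.parse import urlencode

AUTH_ENDPOINT = "https://auth.example.com/authorize"   # provider's authorisation URL
TOKEN_ENDPOINT = "https://auth.example.com/token"      # provider's token URL
CLIENT_ID = "koolagg-client-id"                        # issued at registration
REDIRECT_URL = "https://koolagg.example.com/callback"

def build_authorization_url(scope: str, state: str) -> str:
    """Step 1: the URL the user's browser is sent to for consent."""
    params = {
        "response_type": "code",   # ask for an Authorisation Code
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URL,
        "scope": scope,            # e.g. "mail.read"
        "state": state,            # anti-CSRF value echoed back by the server
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

def build_token_request(auth_code: str) -> dict:
    """Step 2: body of the POST exchanging the code for an Access Token."""
    return {
        "grant_type": "authorization_code",
        "code": auth_code,
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URL,
    }

url = build_authorization_url(scope="mail.read", state="xyz123")
```

The `state` parameter is worth noting: it ties the redirect back to the original request, protecting against cross-site request forgery.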
Great.. I am now getting the hang of it.. That’s a lot of token types going around, isn’t it? A bit more detail on these and the other terminologies might be of help.
The “Access Token” is used by clients to make authenticated requests on behalf of the Resource Owner to the Resource Server. An Access Token can grant access to multiple APIs (which boil down to features). The set of features that can be granted access is controlled by a parameter called “Scope”. The value and meaning of the scope parameter are defined by the Authorisation Server. This parameter needs to be sent with one or more values as part of the Access Token request sent to the Authorisation Server. Access Tokens are valid only for the set of operations and resources set in the scope of the token request, and this is enforced by the Resource Server. This is the reason the Access Token is often equated to a “valet key”, which only allows you to perform a limited set of functions (scope) and is time bound (access duration), unlike a full key.
Access Tokens have a finite lifetime that is often short, which means that when an Access Token expires, the user would need to repeat all the steps performed earlier for a new Access Token to be issued. This can affect the user experience. At the same time, it is not safe to give an Access Token a long lifetime, since in the event of a compromise it leads to a longer window of exposure. To strike a balance between the security and usability aspects, Refresh Tokens were introduced. Hence, if access to the API is needed beyond the lifetime of a single Access Token, a Refresh Token can be used.
Refresh Tokens are tokens used by the client to obtain a new Access Token without having to involve the Resource Owner. In most implementations Access Tokens cannot be revoked (hence the short life) but Refresh Tokens can be. Refresh Tokens tend to have a longer lifetime than Access Tokens. As with the Access Token, the user would need to perform the authentication process again when the Refresh Token expires. Hence it is good practice from a security perspective to set an appropriate expiry time for the Refresh Token.
When an Access Token expires, the application can present the Refresh Token to the Authorisation Server and receive a fresh Access Token and Refresh Token to use the next time. This provides the option to implement any changes in the Access Token policy (eg: altering its lifetime or scope) transparently to the application.
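A minimal sketch of that refresh exchange. The field names follow RFC 6749 (section 6); the actual values here are hypothetical:

```python
# Sketch of a refresh-token exchange; all values are placeholders.
def build_refresh_request(refresh_token: str, client_id: str, client_secret: str) -> dict:
    """Body of the POST sent to the token endpoint when the Access Token expires."""
    return {
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
        "client_id": client_id,
        "client_secret": client_secret,
    }

def handle_token_response(response: dict) -> tuple:
    """Keep the fresh Access Token and, if the server rotated it, the new Refresh Token."""
    access_token = response["access_token"]
    # Servers may optionally issue a new Refresh Token; absent means keep the old one.
    refresh_token = response.get("refresh_token")
    return access_token, refresh_token
```

Because the whole exchange happens between the client and the Authorisation Server, the Resource Owner never sees it — which is exactly what makes policy changes transparent.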
Brilliant.. How about the same usage on mobile devices etc.?
In the case that our aggregation service is an app on your mobile phone, a browser window would pop up for the Resource Owner to provide the grant (User Consent). The OAuth token would be stored on the mobile phone to prevent the need for the user to perform the granting every time the app is opened. It can also eliminate the need for storing passwords on the mobile phone. In the event that the mobile phone is stolen, the user could simply revoke permission for that app without affecting other means of access.
I think I’ve touched upon most of the important aspects of OAuth 2.0 and covered the "Authorisation Code" access pattern (which is by far the most common and also the most secure) in detail. OAuth 2.0 also supports other types of access patterns, also called "Grant Types" (eg: Implicit Grant, Resource Owner Credentials, Client Credentials etc.), which have not been covered in this post for the sake of brevity. That is for a different day.
Further details on OAuth 2.0 specification can be found here.
#KillPassword

Disclaimer: The opinions expressed in this article are my own and do not reflect those of any organisation, current or past, that I have been involved with.
Open Banking / PSD2 - What is that??
Open Banking came into effect on 13th Jan 2018. It was mandated by the UK's Competition and Markets Authority (CMA) in 2016 that the 9 biggest banks in the UK (HSBC, Barclays, RBS, Santander, Bank of Ireland, Allied Irish Bank, Danske, Lloyds and Nationwide) have the capability to expose the information they hold about their customers' bank accounts, and to provide a provision to instruct payments through APIs, in a secure and standardised way, for other entities like challenger banks or Fintechs (referred to in the standards as Third Party Providers - TPPs).
An API, or Application Programming Interface, is a mechanism by which one IT system makes use of the services offered by another in a pre-defined and standard way. For example, it is the way Uber makes use of Google Maps in its app.
This was aimed at levelling the playing field and promoting innovative products and services for consumers. For example, the transaction history stored in a customer's bank account is a treasure trove of information, such as what they earn and how much they spend on utilities, groceries, insurance, mortgage etc. This information can be utilised by TPPs to provide meaningful analysis, or the TPPs can act as service providers to pay for goods and services. As an example, if you are a multi-bank customer, a TPP could provide a consolidated view of your finances across your different bank accounts.
Though Open Banking superficially looks like it's directed against the big banks, nothing stops the traditional banks from turning the tide by utilising the same capabilities to launch innovative offerings themselves or by partnering with / acquiring other FinTechs.
If that got you concerned about the security and privacy impact, there are two aspects to consider. Firstly, the TPPs providing these services need to undergo a rigorous security and compliance process to be licensed, and are regulated by the FCA (Financial Conduct Authority). Apart from this, they also need to make sure that they have the proper processes and procedures to comply with the likes of GDPR and PCI-DSS.
Secondly, access to a customer's data requires informed and explicit consent from the account holder (which can also be revoked at a later point in time), along with stronger authentication to provide a higher level of assurance. The consent aspect also has a tie-in with the EU GDPR Directive. Hence privacy and security are key underlying requirements of this directive.
Opening up account data and providing the ability to perform payments also form part of the EU's 2nd Payment Services Directive (PSD2), which is the latest European legislation on payment standards. One of the key aims for Open Banking was to be compliant with the PSD2 regulation. Interestingly, though the initial scope for Open Banking was only to cover Personal and Business Current Accounts, the latest change in scope (as of Nov 2017) aligns it closer to the PSD2 scope, i.e. covering all payment accounts covered by PSD2, including savings accounts, credit cards, loans, mortgages and even multi-currency accounts. All these additional elements added to the existing scope will be implemented in a phased manner running until Aug 2019.
As mentioned earlier, apart from the possibility of account aggregation, PSD2 / Open Banking brings in another capability that can be exploited by TPPs - instructing payments on your behalf. So, payments could be made through the likes of Facebook Messenger / WhatsApp. A more creative application of this might be AI-based financial services that automatically sense that your current account is approaching the minimum threshold balance you had set, and move money into it from another account of yours held at a different bank... provided you have given the TPP consent for this.
The PSD2 directive identifies two types of TPPs - Account Information Service Providers (AISPs) and Payment Initiation Service Providers (PISPs). AISPs are providers of services that can access a customer's bank account on their behalf and retrieve the information for which the customer has given consent. PISPs, on the other hand, are TPPs that can initiate payment transactions on the customer's bank account for which they have been given consent. Please note that AISP and PISP are distinctly defined roles, i.e. one cannot do what the other can, though the same TPP can be registered and licensed to act as both an AISP and a PISP.
So, what is in it for the Retailers?
In the current world, when a purchase is made on a retailer's website using a credit/debit card, the transaction has to go through a series of intermediaries, ranging from the acquirer to the card scheme provider etc. A charge is added to the transaction, directly or indirectly, at every stage. The overall additional charge is usually in the region of 2 - 2.5% of the transaction value in total. With most retailers not able to pass this surcharge on to the customer, guess who picks up the bill??
In today’s world, content sharing and content aggregation are part and parcel of our personal lives as well as of how business is done. This ranges from the provision to share an article or news item to your social media feeds (Facebook/LinkedIn) from the content source itself, to aggregating all your financial information into a mashup service for a consolidated view, to reducing the registration requirements for service providers like Stack Overflow / SlideShare etc. by using your Google / Facebook credentials. This means there is a need for a mechanism to share information and resources between multiple parties in a reliable and, most importantly, secure fashion.
Before we go deeper into how OAuth 2 handles the above requirements, to aid understanding of the OAuth-related concepts, let's make use of a fictional web-based service provider that aggregates your mail from different providers and gives you a consolidated view with customised interfaces, tagging etc. Let’s call this 'KoolAgg'. Let’s assume that you want to aggregate your Yahoo Mail and Gmail accounts into this new service provider.
Sounds like a cool service indeed.. So what would I have needed to do to use it, if KoolAgg existed for real?
In the days before the emergence of OAuth, you would have needed to provide your Yahoo and Google credentials to this service provider, which would use them to log on to Yahoo Mail and Gmail programmatically, impersonating you, and retrieve your mails (before aggregating and presenting them to you). Sounds perfect, doesn’t it? But then, the below issues began to creep in.
- Remember, this website in effect has your credentials stored in their internal data stores. A compromise of this website would end up revealing all your credentials to the hackers.
- This website uses your password every time it needs to aggregate your email with the individual email providers, which is bound to happen multiple times over the day depending on the frequency of the sync. This means that for every synchronisation request, each of your configured email credentials travels between the two parties over the internet, increasing the possibility of your password being intercepted.
- Your Gmail or Yahoo Mail credentials can be used to access more than just your email. For example, your Google credentials can also provide access to your Google Photos (Picasa) albums. Hence there is no way to grant granular access to your account, such as controlling what can be accessed or for how long.
- It is not possible to revoke access to your account for this specific service provider at a later date unless you change your account's password. This might break all the other service providers you have configured that rely on your account details.
- Apart from the above case, if you needed to change your account password for any other reason, you would need to change the details in all the aggregating services for them to keep providing the service that relies on your account details.
Using a password-based approach reduces the security of this resource / information sharing so drastically that the approach has been coined the "Password Anti-Pattern" (please see here).
That does not sound good.. Wish there was a better way.
Yes, there is.. To avoid the issues associated with the Password Anti-Pattern, it is essential that access to third-party data requires authorisation at the respective third-party site, i.e. access to your Facebook data should be authorised at facebook.com (which is responsible for storing and securing it) and not at the aggregating / mash-up website.
OAuth 2.0 comes to the rescue and handles all of the above issues. OAuth 2.0 allows users to grant permission for a third party to access their resources without the need to share passwords. The access to these resources can be limited by time (how long), scope (which sections of the data) and action (read/update). Another significant advantage of this approach is that, since the authorisation is provided at the site hosting the resource, the resource host (eg: Facebook) can present the necessary information about all the services that have been authorised, along with the scope of each authorisation, in a single dashboard. This helps you audit the permissions at a later date. Most importantly, the authorisation is done by way of token exchange instead of passwords.
Before we go further, a bit of background… Work on the OAuth standard started in early 2006 when Facebook, Yahoo, AOL and Google decided to come together to collaborate on an API security standard (each of them had their own standard before). It was first released around early 2007, after which it underwent a lot of transformations before the latest version, OAuth 2.0, emerged in its current shape in October 2012. It is important to note that OAuth 2.0 is not backward compatible with its previous version. Any further mention of OAuth in this post means OAuth 2.0.
Before we go into the details of this, let us understand the roles defined by OAuth 2.0:
- Resource owner: This is an entity capable of granting access to a protected resource. In our example KoolAgg case, it’s the end user. The resource owner also authorises the scope of authorisation.
- Client: This is the application that obtains authorisation (eg: KoolAgg) and makes protected resource requests on behalf of the resource owner (the end user). In the case of a mobile based solution, the client can also be a mobile app.
- Resource server (RS): This is the server hosting protected resources (eg: Gmail in our example).
- Authorization server (AS): This is a server capable of authenticating the resource owners, obtaining authorization, and issuing tokens. Google’s Auth Server as per our example.
The client "gets a security token" from the Authorization Server based on the Resource Owner's authorisation, and the client "uses the security token" on the Resource Server. This is how the relationship between the above four roles is tied together.
The Registration process
The pre-requisite for offering OAuth 2.0 based access is that the client application (the KoolAgg application in our example) has to register with the OAuth providers it needs to support. This registration process is performed by the client application’s developer to establish a relationship between the application and the service provider (eg: Google / Yahoo in our example). This would in most cases provide the below parameters that are key to using the service:
- ClientID - This identifies the consumer of the service (eg: KoolAgg as in our example) to the service provider (say Google)
- ClientSecret - The secret used by the consumer to prove its identity to the Service Provider.
- RedirectURL (Redirect Endpoint) - Where the service will redirect the user after they make an authorisation decision (in most cases provided by the Client to the Service Provider)
- AuthorisationURL (Authorisation Endpoint) – Where the client sends the user to initiate the authorisation flow (provided by the Service Provider to the Client)
- TokenURL (Token Endpoint) – Used by the client to initiate the token flow (provided by the Service Provider to the Client)
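The registration output above can be collected into one client-side configuration. A minimal sketch — none of these values are real; a provider such as Google or Yahoo issues its own IDs and publishes its own endpoint URLs:

```python
# Hypothetical per-provider configuration produced by the registration step.
# Every value here is a placeholder for illustration only.
OAUTH_PROVIDER_CONFIG = {
    "google": {
        "client_id": "koolagg-google-client-id",    # identifies KoolAgg to the provider
        "client_secret": "s3cr3t-keep-server-side",  # proves KoolAgg's identity; never ship to browsers
        "redirect_url": "https://koolagg.example.com/callback/google",
        "authorization_url": "https://accounts.example.com/o/oauth2/auth",
        "token_url": "https://accounts.example.com/o/oauth2/token",
    },
    # "yahoo": same five fields, with Yahoo-issued values
}
```

Note that the ClientSecret must stay on the client application's server; only the ClientID and RedirectURL ever appear in the user's browser.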
The high level flows in OAuth 2.0:
- Client obtaining the authorisation grant from Resource Owner.
- Client making a call to the Authorisation Server (AS) to exchange the authorisation grant for an Access Token.
- Client accessing the protected resources at the Resource Server after presenting the Access Token.
It is recommended that all communications between the client and the service provider are TLS secured in both directions.
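The third step — presenting the Access Token to the Resource Server — is conventionally done with a Bearer header (per RFC 6750). A sketch, with the API URL and token value as hypothetical placeholders:

```python
# Sketch of calling a protected resource with an Access Token.
# The API URL and token value are placeholders, not a real provider's.
from urllib.request import Request

def build_api_request(api_url: str, access_token: str) -> Request:
    """Attach the Access Token as a Bearer credential; send only over TLS."""
    return Request(api_url, headers={"Authorization": f"Bearer {access_token}"})

req = build_api_request("https://mail.example.com/v1/messages", "ya29.example-token")
```

The Resource Server validates the token and checks its scope before serving the request — the client never handles the Resource Owner's password at any point.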
Phew!!! That’s a lot of technical stuff to take at a time.. I will read it up again on my commute back home later..
Not a bad idea.. We shall continue our discussions on part 2 of this post.
#KillPassword

In the first part of this post we saw why using passwords is not secure in the current climate and how you could add additional factors to make authentication more secure. In this post we will continue on the path to getting closer to "Password Nirvana".. The quest towards eliminating the use of passwords from our digital life...
Can we do anything about eliminating the "password" element in Authentication?
A recent survey has found that a third of people now admit to having grown angry after struggling to remember log-in details. Users in the survey say that "forgetting passwords to an account they need immediate access to is more annoying than misplacing their keys, having a cellphone battery die, or receiving spam email." Hence it would be ideal if we could implement an authentication system where the password is not used as an authenticating factor at all, as passwords provide one of the lowest levels of assurance. There are current concerns / barriers to moving to that state though, like cost, privacy concerns, the maturity of alternative technologies etc.
For an authentication factor to dethrone the password from its status as the default primary authentication factor, it should at a minimum not be forgettable, social-engineerable or phishable.
The industry has started to respond to these requirements with the development of alternative solutions. One of the front runners in this is the Fast Identity Online (FIDO) Alliance. The FIDO Alliance is a cross-industry consortium with member companies including some heavyweights like Google, Microsoft, Intel, Samsung, Lenovo, Alibaba Group, PayPal, NTT Docomo, American Express, Bank of America, Visa and MasterCard backing it. It seeks to eliminate the dependency on the password as an authentication means. FIDO recently announced a government membership programme and has enrolled two government bodies - the Cabinet Office (UK) & NSTIC/NIST (US).
The FIDO Alliance has proposed two open specifications:
1. Universal Second Factor (U2F), which augments a first factor (usually a password, though not mandated to be so) with any token compliant with the U2F standard.
2. Universal Authentication Framework (UAF), which aims at replacing passwords completely. It does this by having a de-coupled user verification happen locally on the device, using a component called the FIDO Authenticator, and authentication with the service provider using public key cryptography.
With the advent of newer smartphones / tablets / laptops (eg: iPhone 6, Samsung Galaxy S6/Note 5, Nexus 5X/6P, Lenovo laptops etc.) built in with features to capture biometric factors like fingerprints, face recognition or voice prints, biometric factors could be used in lieu of the password for authentication. This can significantly reduce the Total Cost of Ownership (TCO), as the means to provide the authentication is already built into the user's device. For the user, this is more convenient than remembering a password. A win-win situation.
Storing my biometrics on a remote server? No way..
There is a lingering question when using biometric authentication regarding the privacy impact of storing biometric information in a centralised remote location. With FIDO's UAF based authentication, even if we were to use biometric authentication on our smartphones or laptops with a fingerprint scanner, the biometric information never leaves the device. The biometric pattern based on the user's biometric information, and the public / private keys generated specific to the combination of user + device + service, are stored securely within the device. The public key is sent securely to the service provider as part of registration and is used during the authentication process to validate the authentication response sent to the service provider (signed using the private key).
As illustrated above, there is a clear separation between user verification and the FIDO authentication with the service provider. This means that the user can use any authenticator the device is able to support, as long as it is compliant with the service provider's policy on acceptable authenticators. You can probably call this "BYOAuth". So, for the same service provider, one user might use an authenticator that does fingerprint based verification while another user might use an NFC-enabled YubiKey (like the YubiKey Neo). This also makes it easy to future-proof against any new forms of verification methods.
Hmm.. Interesting, has it seen the sun or is it still on paper and prototypes?
Not at all.. It's very much live and kicking.. FIDO's UAF based authentication is gaining traction (more so on the consumer front), albeit slowly.
PayPal ( https://www.paypal-pages.com/samsunggalaxys5/us/index.html ) and AliPay both support FIDO's UAF based authentication.
NTT Docomo's OpenID (DocomoID) supports UAF based authentication ( http://www.nfcworld.com/2015/05/26/335459/ntt-docomo-rolls-out-fido-biometrics-platform-in-japan/ ).
For more information on the FIDO’s specifications, please visit https://fidoalliance.org/specifications/overview/
Apart from FIDO UAF compliant authentication, there are other biometric authentication solutions that also seek to eliminate passwords, like Fujitsu's iris based authentication ( http://www.fujitsu.com/global/about/resources/news/press-releases/2015/0302-03.html ) and behavioural authentication like Lockheed Martin's gesture based authentication solution developed for the NSA, called Mandrake ( http://www.engadget.com/2015/05/26/nsa-tests-finger-swipe-identification/ ).
In closing..
Replacing passwords with stronger authentication factors like biometrics is not a silver bullet, but it does provide a higher level of assurance and significantly reduces the possibility of a breach. For high-risk / sensitive transactions, FIDO based authentication augmented by contextual authentication (using geolocation, device fingerprinting, user profiling etc.) could be employed to achieve a higher ability to prevent fraud.
On the original question of "Can the password be dethroned?", we are not there yet, but we are much closer to it and moving towards it at a steady pace..
#KillPasswords #FIDO

Wishing all a very Happy new Year !!!
Looking back at 2015, we have had a year with one of the highest numbers of password breaches in the decade, so much so that not even password managers were spared...
I am not worried.. My passwords are very complex..
Much has been said in the wider security circle about the importance of having strong and complex passwords. A good authentication system should allow access to a protected resource to those who are authorised while keeping away those who are not. Both these aspects are equally important. If an authentication system that allows an unauthorised person to access a protected resource is a failure, it is equally a failure if the very same system puts off genuine users by virtue of having highly complicated and restrictive authentication requirements. There needs to be a perfect balance between security and usability.
Password based authentication is the most used since it is also the easiest to implement. For most of the last decade, the security fraternity has been urging users to use complex passwords that are hard to guess. Add to that the fact that, on average, we each use well over 20 passwords. With convoluted passwords come the issues of users forgetting them, re-using them across many systems and, in some cases, writing them down on sticky notes (physical or virtual). They are also very hard to type on mobile devices (particularly if you have a complex one).
Talking about password re-use, there is a growing number of enterprise users who re-use passwords between their personal identities (like Facebook, LinkedIn or even a not-so-secure website that gave them a free USB stick for registering at a trade show) and their corporate identities, regardless of whether the passwords are complex enough. If any of these websites is compromised, with a bit of social engineering the user's corporate access could also be compromised.
What was an excellent way to authenticate users a decade ago may not be safe enough in the current climate. Hackers empowered with computers with extremely high processing power, working in tandem on P2P systems, can crack even a complex password in a jiffy with a fairly high success rate. Passwords carry the inherent risk that they can be compromised by phishing, key-loggers or malware. All this makes the password one of the weakest forms of authentication, scoring low on security as well as usability.
How about if I use something along with the password to make it safer?
From the discussion above, we can understand that relying on the password as the sole means to protect access to resources is not secure enough in today's climate. While not having to use a password as a factor at all would be the perfect situation, a good place to start would be to enhance the security provided by passwords by adding additional layers of authentication factors. This involves the user providing two or more factors of authentication, which would be a combination of:
- something the user knows (password, PIN etc.),
- something the user has (Smartcard, KeyFob, Soft-Token, YubiKey, X.509 certificate etc.) and/or
- something the user is (Fingerprint, Iris recognition, Voice pattern etc.).
This stronger authentication is known as Multi-Factor Authentication. The general guideline is to have a mixture of factors belonging to different types, e.g. password + keyfob.
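A concrete example of a "something the user has" factor is the time-based one-time password (TOTP, RFC 6238) generated by a keyfob or soft-token app. A minimal sketch of how server and token independently derive the same short-lived code from a shared secret (the secret here is an illustrative example, not a real credential):

```python
# Minimal TOTP (RFC 6238) sketch: keyfob and server share a secret and a clock.
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, timestamp: int, digits: int = 6, step: int = 30) -> str:
    """Derive the one-time password for the 30-second window containing `timestamp`."""
    counter = struct.pack(">Q", timestamp // step)          # index of the time window
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                              # RFC 4226 dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Both sides compute the same code; it expires with the time window.
current_code = totp(b"example-shared-secret", int(time.time()))
```

Because the code changes every 30 seconds, a phished or shoulder-surfed value is useless moments later — which is precisely what the second factor adds on top of the password.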
The purpose of authentication is to arrive at a level of assurance that the person authenticating is who they claim to be. Making use of multiple factors can increase how assured you are about the person authenticating. That said, each type of authentication factor provides a different level of assurance. For example, proving using "what you have" is usually more assuring than proving using "what you know", and similarly "what you are" provides a higher level of assurance than the other factors.
This leads us to an interesting use of multi-factor authentication where a particular authentication factor could be required based on a risk-based approach. So, if an HR executive is accessing the corporate HR application from within the corporate network, he might need to provide only his password to authenticate; but if he is accessing the same application from a coffee shop down the road, the risk profile is higher, so he might be expected to provide an OTP from a keyfob or soft-token; and if he is changing any personnel records while logged in outside the corporate network, he may need to provide additional authentication factors like a fingerprint, as the risk profile has increased further. This is known as "Step-Up" Authentication.
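The HR-executive scenario above can be sketched as a tiny policy function. The risk signals and factor names here are illustrative placeholders, not a real product's API:

```python
# Sketch of a risk-based "Step-Up" policy; signals and factors are illustrative.
def required_factors(on_corporate_network: bool, modifying_records: bool) -> list:
    """Return the authentication factors demanded for this access context."""
    factors = ["password"]                 # baseline: something you know
    if not on_corporate_network:
        factors.append("otp")              # step up: something you have
        if modifying_records:
            factors.append("fingerprint")  # step up again: something you are
    return factors
```

Real implementations feed many more signals into the risk score (geolocation, device fingerprint, time of day, behavioural profile), but the shape of the decision is the same: the riskier the context, the more assurance demanded.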
The issue of users forgetting passwords due to having to remember many of them is addressed to a great extent by using password managers. Enterprises address this using Single Sign-On (SSO) solutions. But this does not address the weakness of the password as something that can be compromised by phishing or malware, or the fact that cracking passwords has become easier in the current climate of co-ordinated attempts using P2P etc.
Sounds great.. But can't we do anything about the password element in the above?
Yes. There are solutions emerging in this space.. In my next post I will be covering how we could move towards eliminating the need for passwords...
#KillPasswords