Tyk News

All the latest news & updates on the Tyk API Management Platform

7 Critical Factors For Selecting Your API Management Layer

Remember the final scene of Indiana Jones and the Last Crusade, where the old Knight says: “You must choose, but choose wisely”? Admittedly, Indy wasn’t choosing an API Management Solution and certainly had other things on his plate that day. However, the adage still applies, and to help you choose wisely we’ve got James Higginbotham, who has put together a list of the seven critical factors to consider when choosing your API Management Layer.


I’m often asked which API management layer is the best one available today. The answer is always, “It depends”. Whether you are considering an open source or closed source API management layer, the number of vendors and options available today is astounding. Many API management solutions focus on delivering specific capabilities, while others strive to cover a breadth of features but don’t go very deep in all areas. This article will shed some light on how to approach the decision-making process for managing your API, so that you can ensure the needs of your business, product, and development teams are met.

Why Do You Need API Management?

For those unfamiliar, API management layers accelerate the deployment, monitoring, security, versioning, and sharing of APIs. They are often deployed as a reverse proxy, intercepting all incoming API request traffic and applying rules to determine if requests should be routed to the API. In addition to traffic management, they commonly offer:

  • Token-based authorization support through API-key based authentication and/or OAuth 2
  • Deployment and versioning support for redirecting incoming requests to the current or newly deployed release of an API
  • Rate limiting to reduce the impact of greedy API clients and denial of service (DoS) attacks
  • Developer portals for hosted documentation and self-onboarding by developers
  • Administrative portals for viewing usage reports
  • Billing and payment support for selling subscription-based access to your API
  • On-premise, cloud, and hybrid hosting deployment options

API management layers may be offered as purely closed source, purely open source, or in a hybrid model using a combination of open source components and closed source offerings.

Factor #1: Self-hosted and SaaS deployment options

Your deployment requirements are a huge factor in API management layer selection. While most vendors offer managed cloud-based options, some choose to do so only during the early stages of your API, requiring you to move to an on-premise solution as your traffic increases. Knowing how you need to deploy your API management layer, including the resources available to monitor and maintain it, is important to the selection process. Look for a vendor that offers the kind of deployment you require: on-premise or managed cloud services. If you are unsure, select a vendor that offers a seamless transition from one to the other, such as Tyk.io.

Factor #2: Simple installation process

If your API management layer will reside within your own cloud environment or data center rather than hosted, then installation needs to be simple. Evaluate the installation process to ensure that standing up new instances and new environments (e.g. staging, UAT, integration) will be easy – and preferably automated. If you prefer containerization, consider vendors that offer a container-based distribution to reduce the effort required to support your deployment process.

Factor #3: Meets feature requirements

Part of your selection process should include an evaluation. We covered this in a previous article, but I’ll repeat it here for reference. Your evaluation should include the following considerations:

  • Authorization – can you implement your desired authorization mechanism (e.g. API tokens, keys, OAuth 2, etc) to meet your needs?
  • Performance – how much overhead does the layer add to each request? Measure the performance of your API endpoints before and after installing the API management layer (a simple measurement sketch follows this list). Expect some reduction in performance, but also ensure that the management layer doesn’t cause a drastic decrease that may require additional server capacity
  • Security – perform some basic penetration testing to verify that the layer is catching common attack vectors. Attacks such as SQL injection, denial of service attack prevention through rate limiting, and other attacks can often be simulated with some simple scripts
  • Onboarding – how easy or hard will it be for your developers to get onboarded? Does the onboarding process support the business, product, and technical needs of your company?
  • Reporting – does the management layer provide the information you will need on a day-to-day basis to better serve your developers? Can you export data via an API or push it into an external reporting solution easily, for integration into other daily/weekly reports?
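
A quick way to quantify that overhead is to time the same endpoint directly and through the gateway. Below is a minimal sketch using Python’s requests library; the two URLs, the endpoint path and the sample size are placeholder assumptions you would swap for your own setup.

    import time
    import statistics
    import requests  # pip install requests

    # Hypothetical endpoints: one hitting the API directly, one via the management layer.
    DIRECT_URL = "http://localhost:8000/widgets"
    GATEWAY_URL = "http://localhost:8080/widgets"

    def median_latency_ms(url, samples=50):
        """Median round-trip time in milliseconds over `samples` GET requests."""
        timings = []
        for _ in range(samples):
            start = time.perf_counter()
            requests.get(url, timeout=5)
            timings.append((time.perf_counter() - start) * 1000)
        return statistics.median(timings)

    direct = median_latency_ms(DIRECT_URL)
    gateway = median_latency_ms(GATEWAY_URL)
    print(f"direct: {direct:.1f} ms, gateway: {gateway:.1f} ms, overhead: {gateway - direct:.1f} ms")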

Factor #4: Customization should not be required

I was recently discussing the abundance of infrastructure tools available to development teams today. With every tool comes the burden of understanding it and getting it integrated into your environment. Some tools choose to offer a variety of options, but require considerable effort to get them running. Be sure to evaluate the effort required to start using the API management layer. Customization options are great, but if you can’t get started easily or without installing lots of plugins, you need to know this ahead of time.

Factor #5: Easy upgrades

Whatever solution you select, you will need to keep it upgraded to ensure you have the latest improvements and available features. Evaluate the upgrade process by reading past release notes to better understand what will likely be required. If there are no release or upgrade notes, that should raise a concern. Just keep in mind that some commercial offerings only supply these details directly to customers or via a customer portal, so if you don’t find anything publicly, contact the vendor to confirm that they are available to paying customers.

Factor #6: Vendor viability

We all want API management vendors to experience growth and success. However, not everyone will be around in the long term. Consider the vendor’s viability by understanding their revenue model. For open source solutions, take into consideration the companies backing the solution, along with the community that is supporting it. If there isn’t much activity, then the solution may become abandoned in the future.

Factor #7: Management Automation

Finally, consider the automation options available to configure, manage, and integrate the solution into your operations processes. Vendors that expose every configuration option through an API, along with reporting APIs and webhooks for important events, make it easy to automate changes and integrate the solution into your deployment process.
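
As a rough illustration of what that automation can look like, the sketch below registers an API definition and pulls a usage report through a hypothetical management REST API. The base URL, endpoint paths, header and payload fields are all assumptions; consult your vendor’s actual API reference for the real ones.

    import requests  # pip install requests

    # Hypothetical management API - substitute your vendor's real endpoints and auth scheme.
    BASE_URL = "https://management.example.com/api"
    HEADERS = {"Authorization": "Bearer <admin-token>"}

    # Register (or update) an API definition as part of a deployment pipeline.
    api_definition = {
        "name": "orders",
        "listen_path": "/orders/",
        "target_url": "http://orders.internal:8080/",
        "rate_limit": {"per_second": 100},
    }
    resp = requests.post(f"{BASE_URL}/apis", json=api_definition, headers=HEADERS, timeout=10)
    resp.raise_for_status()
    print("API registered:", resp.json())

    # Pull usage figures for a daily report.
    report = requests.get(f"{BASE_URL}/reports/usage", params={"period": "24h"},
                          headers=HEADERS, timeout=10)
    print("Requests in the last 24h:", report.json())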


As you have likely realized, it isn’t easy to select an API management layer. However, your decision will have ramifications for months or years to come. It may offer tremendous flexibility or severely limit your options in the future. Take the time to properly evaluate the API management layer that best fits your needs.

Why Open Source is right for your business

This month we feature a guest post by the editor of the excellent API Developer Weekly: James Higginbotham who talks about the benefits of Open Source API Management for your organisation – Take it away James!


Over the last 5 years, we have seen tremendous growth and options available for API management. While some are closed source, many vendors, such as Tyk.io, are choosing to launch as open source API management solutions. Companies are now asking the question, “Is an open source API management layer the right choice?” Let’s examine some of the advantages of open source API management, to help us through the decision-making process.

Advantage #1: Avoid the “DIY” API Management Solution

I have spoken to some groups that have rolled their own API management solution. While your team may be the unique snowflake that needs to build your own API management layer, doing so requires considerable time, resources, and expertise. Instead, start from an open source API management layer.

Dave Koston, VP Engineering for Help.com, agrees: “There’s simply no way we could internally build the feature set of many of the OSS products we use as it would take 10-20 times longer than learning their product and the cost would be many times higher as well.”

A good open source API management layer should offer ways to customize the solution, either via clearly defined APIs or a plugin architecture. Your focus should be on delivering value to the market, not becoming experts in API management.

Advantage #2: Code Reviews Create Confidence

Open source solutions allow the API provider to perform a code and security review – perhaps pairing someone from the API provider with an engineering resource from the vendor. However, Mr. Koston recommends caution when factoring code reviews into your OSS selection: “We reviewed other solutions which were wrapped into other web servers like nginx but having multiple levels of software inside the gateway made it hard to determine where problems arose. Being able to simply read the source of a single product and talk to a single vendor makes the product, and any issues much easier to reason about and deal with.”

Keep in mind that your API management priorities may not be the priorities of some vendors. API management layers must offer a breadth of features. Not every vendor will focus on the ones most important to you. Being confident in the code that is protecting your APIs is important.

Advantage #3: Jumpstarts Your API Management Early

API monitoring and security should start early, not after experiencing growth. Too often, I have seen companies deploy without an API management solution, only to realize that they have no insight into how their API is consumed, who is consuming it, and whether any security compromises have occurred. The most commonly cited reasons are limited time, little or no budget, or uncertainty about whether the API program will succeed. Once the API program experiences growth, the impact of installing an API management layer is much greater and can have a negative impact on existing API consumers due to changes in account and API token management. Open source API management layers make this an easy and affordable option, even if your API is only used internally or to power web and mobile apps.

What About Technical Support?

When adopting an open source development tool, technical support may vary from GitHub tickets to mailing lists and Slack groups. However, choosing an open source API management layer doesn’t mean you have to go without vendor support. Many vendors, including Tyk.io, offer technical support packages that address the needs of the enterprise, mid-size companies, and growing startups. Be sure to evaluate how your API management layer will be supported long-term as part of your assessment.

Getting Started

Many open source vendors offer distributions of their API management layer that are easy to install on a laptop, on-premise, or in the cloud. Start by building a prototype API that mimics your needs (a minimal example follows the list below), then try out each API management layer against it. Your evaluation should include the following considerations:

  • Authorization – can you implement your desired authorization mechanism (e.g. API tokens, keys, OAuth 2, etc) to meet your needs?
  • Performance – how much overhead does the layer require for each request? Measure the performance of your API endpoints before and after installing the API management layer. Expect some reduction in performance, but also ensure that the management layer doesn’t cause a drastic decrease in performance that may require additional server capacity
  • Security – perform some basic penetration testing to verify that the layer is catching common attack vectors. Attacks such as SQL injection, denial of service attack prevention through rate limiting, and other attacks can often be simulated with some simple scripts
  • Onboarding – how easy or hard will it be for your developers to get onboarded? Does the onboarding process support the business, product, and technical needs of your company?
  • Reporting – does the management layer provide the information you will need on a day-to-day basis to better serve your developers? Can you export data via an API or push it into an external reporting solution easily, for integration into other daily/weekly reports?
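
If you don’t already have a suitable test API, the prototype can be as small as the sketch below: a single placeholder endpoint built with Flask that you can put behind each candidate gateway and run the checklist above against. The route and data are illustrative only.

    from flask import Flask, jsonify  # pip install flask

    app = Flask(__name__)

    # A single placeholder endpoint to proxy through each candidate API management layer.
    WIDGETS = {1: {"id": 1, "name": "left-handed widget"}}

    @app.route("/widgets/<int:widget_id>")
    def get_widget(widget_id):
        widget = WIDGETS.get(widget_id)
        if widget is None:
            return jsonify({"error": "not found"}), 404
        return jsonify(widget)

    if __name__ == "__main__":
        app.run(port=8000)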

Part of any API program’s responsibility is to select a great API management layer. Make the time to do a proper evaluation to ensure that the one you select will meet the needs of your company.

Tyk joins the Open API Initiative!

Oh frabjous day! This is exciting stuff…

For those of you who don’t know what the OAI is, here’s what they have to say about what APIs represent:

APIs form the connecting glue between modern applications. Nearly every application uses APIs to connect with corporate data sources, third party data services or other applications. Creating an open description format for API services that is vendor neutral, portable and open is critical to accelerating the vision of a truly connected world.

Now, everyone together – what’s the Tyk motto?

Wait, you don’t know it? Shame on you. It’s:

“Connect every system in the world”

Sounds awfully similar amirite?

Bottom line: our goals here at Tyk and the goals of the Open API Initiative are very well aligned. As an API Management platform, a lot of what we end up doing is reading, displaying and interacting with the OpenAPI spec, so it’s very important to us and our users that we know what’s going on.

We work with a lot of big companies, and we work with a lot of small, fun startups that like to really push the boat out. Every time we see someone talking about another standard, we see the problems caused by the various vendored implementations (and we’re a vendor!), and by golly, we don’t want to be part of the problem, we want to be part of the solution. Standards make things better – we like standards – and we’re painfully aware of what yet another standard would mean to the world of API description languages.


So we are proud to announce that Tyk is now an official member of the Open API Initiative (OAI) – a Linux Foundation collaborative project that is well placed to reduce, rather than proliferate, standards.

The OAI aims to create an open source, technical community where industry participants may easily contribute to and adopt the project’s technology and focus on creating, evolving and promoting a vendor neutral description format currently known as the Swagger Specification.

We are all looking forward to contributing to this project and being a part of reaching our goal.

Tyk 2.2 is Here!

That’s right, the long-awaited version 2.2 release is finally here. We’ve been cleaning up, adding features and, most importantly, making Tyk easier to use.

The Gateway

The Gateway now has some pretty cool new features:

  • OAuth now has support for the client credentials flow
  • It is now possible to use generative security policies with OAuth clients, which means no more needing to send any session data back to Tyk!
  • XML support for inbound and outbound body transforms
  • Context Variables: Get quick and easy access to request data across a variety of middleware – data such as requester IP, form data, headers, path parts and other metadata, all available to the transform middleware
  • Partitioned Policies: Now you can have policies only affect one part of a token instead of all elements, for those who have very complex sales strategies or quota allocation strategies, this makes it much easier to manage. So instead of having to grant ACL, Quota and Rate Limit rules in one go, you can just grant ACL rules, or Quota rules, or both.
  • Normalised URLs in analytics: With this latest update you can normalise the URLs in your analytics to remove those pesky UUIDs and Numeric IDs for more meaningful data


We’ve left this for last because we’re pretty happy about it: Tyk now transparently supports WebSocket proxying as part of your APIs. WebSocket connections can be protected by the same mechanisms that currently protect your existing APIs, be that Bearer tokens, our Circuit Breaker, or our Load Balancer – the WebSocket proxy is transparent and will “just work”.
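
If you want to see the transparent proxying for yourself, something like the following sketch works. It uses the websocket-client Python package, and the gateway URL and token below are placeholders for your own listen path and credentials.

    from websocket import create_connection  # pip install websocket-client

    # Placeholders: your gateway's WebSocket listen path and a valid API token.
    GATEWAY_WS_URL = "ws://localhost:8080/my-websocket-api/socket"
    TOKEN = "<your-api-token>"

    # The gateway applies the usual auth checks on the upgrade request,
    # then proxies the socket upstream.
    ws = create_connection(GATEWAY_WS_URL, header={"Authorization": f"Bearer {TOKEN}"})
    ws.send("ping")
    print("Received:", ws.recv())
    ws.close()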

The Dashboard

The dashboard now has a UI for all of the above features, but the biggest change you will find is our new i18n support. That’s right, Tyk Dashboard now has language-pack capability.

To get things started, we’ve translated the UI into Simplified Chinese and Korean, and have made the language packs available as an open source repository here.

That’s right, it means that it is easy and dynamic to add a new language to Tyk (or to change the wording of the UI if it doesn’t suit you). All of this is configurable and easy to deploy, so have at it.

This update has been a small one for us, because we’re trying to make smaller, more effective releases that help our community and users instead of breaking things. We’ve got some pretty major features in the pipeline coming up, but our real focus will be on stabilising the platform.

As always, Tyk is available on our apt, yum and docker repositories!


Martin & The Tyk Team.

Integrate Tyk with Auth0

There’s been a lot of community chatter about this, and a lot of back-and-forth trying to get Auth0 to play nice with Tyk’s low-level JWT handling.

Well, now you can chill, because Auth0 integration is now easy as pie!

Let’s get things ready:

  1. In your Auth0 Application, under OAuth Settings, click “Show Advanced Settings”
  2. Select the “OAuth” tab
  3. Make sure “JsonWebToken Token Signature Algorithm” is set to RS256
  4. Save it

Tyk OpenID Connect Support with RSA Keys

We’re going to simulate a login here, but in a real app, as part of your OAuth flow, you will need to add scope=openid to your authorize request in order to get the OpenID ID Token.

To simulate a login (I assume you have a test user), browse to your “Users” section and click “Sign In As User” and select your App. In order to get the ID Token, you must use the “Client Side App” option.

Tyk Auth0 OpenID Connect Support

When you click this, you’ll be taken to a login page provided by Auth0 – you don’t actually need to log in, Auth0 will do that for you. In the address bar, you will see a query-string parameter called “id_token”: this is the OpenID ID token you are going to use with Tyk, so copy it to a file somewhere safe.

Now that we have a token we can play with, we can use it with an API we are proxying with Tyk.

Setting up Auth0 with Tyk

  1. Create a new API, let’s call it auth0
  2. Select “OpenID Connect” as your Authentication mode
  3. Save it.

OIDC with Tyk is a little chicken-and-egg, because we need to apply a set of access rules to users coming in via different clients, so we actually need to create a policy before adding the rules to your API Definition. Now that we’ve saved the API, go and create a policy that grants access to it.

Now, back to your Auth0 API Definition:

  1. Add your Auth0 URL as the provider (e.g. https://tyk-test.eu.auth0.com/, each one is different) – in the field that has accounts.google.com as a sample text, and click “Add”
  2. When the new issuer is added, get your Auth0 App’s Client ID and add it as an approved Client ID, and under policies, select the policy you just created.
  3. Click “Add” – you’ll see the policy add to the table
  4. Save the API

Ok, you’re all set – now all you need to do is craft a request to your API using the Authorization: Bearer {id-token} header. You’ll see that the token will get through, and the rate limits / policies will be applied for the user that was requested.
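
For example, with Python’s requests library (the gateway URL below is a placeholder for your own Tyk listen path):

    import requests  # pip install requests

    id_token = "<the id_token you copied earlier>"

    resp = requests.get(
        "https://your-tyk-gateway.example.com/auth0/",  # placeholder listen path
        headers={"Authorization": f"Bearer {id_token}"},
    )
    print(resp.status_code, resp.text)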

Easy as pie 🙂

Tyk v2.1 is out – Now with Open ID Connect, bug fixes and more!

Recently we announced that we had added full support for Open ID Connect to our Cloud platform, and that we were moving it to our next release in due course.

Well, the wait is over – and as of today it is available to everyone! That’s right, Tyk v2.1 and Tyk Dashboard v1.1 are now available.

This release’s main feature is the OIDC support, however we have also made many improvements and bug fixes, all of which can be seen in the Change Log.

To get started with 2.1 you can just upgrade your existing installations – all the deployment methods are supported, and v2.1 will be the default installation for all major distribution methods. But before you go off and do that, please back up your configuration files!

This is our first attempt at making smaller, more regular releases to ensure that upgrades are easier and involve less risk for you. We dog-food every feature and change in our cloud platform before we cut a version for on-premise installation, so you can be sure that we’ve put all our builds through their paces.


Martin & The Tyk Team

OpenID Connect Support in Tyk Cloud is Here!

OpenID Connect support just went live on Tyk Cloud!

So let’s talk about how OpenID Connect support works with Tyk – because it’s pretty cool.

You can now take JSON Web Tokens generated by an OpenID Connect-compatible Identity Provider (id_tokens, in OIDC parlance) and point them at your Tyk-Cloud-managed API, and Tyk will then jump through hoops to make your life easier.

First, we validate the token:

  1. Is the token a valid JWT?
  2. Is the token issued by a known OP?
  3. Is the token issued for a known client?
  4. Is the token valid at the current time (the ‘not before’ and ‘expires at’ claims)?
  5. Is the token signed accordingly?

Then, we apply some rules (a rough sketch of the validation step follows this list):

  1. For this client ID, is there an associated token policy?
  2. Is there an underlying identity (the user ID of the bearer of the token)?
  3. Generate an internal representation of that user, so they can be identified across JWTs and clients
  4. Apply the policy template to that identity (that’s your access control, throttling and quotas)
  5. Generate some useful meta-data for your analytics
  6. Let the request go on
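
For the curious, here is a rough, illustrative sketch of the validation half of that flow using the pyjwt library. This is not Tyk’s actual code; the issuer, client ID and JWKS URL are placeholders for your own OP’s values.

    import jwt  # pip install pyjwt[crypto]
    from jwt import PyJWKClient

    ISSUER = "https://your-idp.example.com/"  # a known OP
    CLIENT_ID = "<registered-client-id>"      # a known client

    def validate_id_token(id_token: str) -> dict:
        # Fetch the OP's signing key, then check signature, issuer, audience, exp and nbf.
        jwks = PyJWKClient(f"{ISSUER}.well-known/jwks.json")
        signing_key = jwks.get_signing_key_from_jwt(id_token)
        return jwt.decode(
            id_token,
            signing_key.key,
            algorithms=["RS256"],
            audience=CLIENT_ID,
            issuer=ISSUER,
        )

    claims = validate_id_token("<id_token from your OP>")
    print("Valid token for identity:", claims["sub"])  # "sub" is the underlying identity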

You can, if you are so inclined, even have the bearer rate-limited differently depending on their source, so if they came from your free client, then they get low access, but if they use your enterprise version, they get super-fueled access. It’s as easy as flipping a switch in your API configuration.

What does this mean?

It means that you do not need to integrate with Tyk at all, or even have Tyk generate tokens for you. Token generation and control can rest entirely with your IDPs using the OIDC standard – just point them at your Tyk Cloud instance. All you need to do is decide which issuers, and which of their registered clients, to allow through, and set which policies and rules apply to those clients.

That means Mitre, Google+, Auth0 and any other Single-Sign-On provider that can handle Open ID Connect tokens is now compatible with Tyk Cloud.

But Wait, I’m an on-prem user! I want OIDC Too!

Well, you won’t have to wait long – we’re going to be pushing a release very soon with this feature because we think it’s so awesome. If you are extremely impatient, it will be live in our nightlies very shortly.

What could be better? What needs to be added? Tell us, we can take it!

At Tyk, we are committed to helping people connect systems. We have some pretty great ideas on how that should happen. Tens of thousands of Tyk users agree. Maybe even you?

When it comes to our roadmap, we listen to the thousands of open source users, Pro license holders and cloud subscribers. We listen, build and release. Always getting better. Fast.

We believe we have the most transparent roadmap in the industry – it’s a Trello board, and it sits here.

We would love you to contribute to this, so let us know what your priorities are. You can feed into it by contacting us through the forum, GitHub, Gitter or email.

So this post is aimed at you. We know you have an opinion, and we know you think there is a better way of doing things. That’s why you use Tyk and not “Boring McBoringface’s Monolith API Stack”. So whether you have the Community, Pro, Enterprise or Cloud edition – tell us what you want from Tyk.

Integrating Tyk Open Source API Gateway with a Custom Identity Provider using JSON Web Tokens

That’s quite a mouthful. But hey, you know what a lot of users want to do? Use their own Identity Provider, and the new hotness is JSON Web Tokens. For those who don’t know what they are, they’re pretty friggin’ cool – find out all about ’em here. We’re going to use them to do some cool trickery/magicky API Gateway token generation without even having to generate a token.


But seriously, it’s pretty cool – in short: you can have a custom Identity Provider (IDP) generate JSON Web Tokens for you, and then use those tokens directly with Tyk, instantly. Better yet, the underlying identity of the token (its owner) is tracked across JWTs, which means they can log in elsewhere, or via a different app or portal, and still get the same rate limits applied (or different ones – it’s all up to your IDP, not us!).

So how does Tyk Magically do this? Well, the process is pretty involved, so we thought we’d talk you through it:

Centralised secrets and virtual tokens

With centralised secrets, we do not create any tokens at all within Tyk; instead, all tokens are generated externally. Centralised secrets involves storing a shared secret (HMAC or RSA) at the API Definition level. This secret is then used to validate all inbound requests, and access rules are applied based on specific fields that can be added to the claims section of the JWT to manage the underlying identity’s access to managed APIs.

To use this option, we do not generate a token at all. Instead, we go to the API Designer and, under the JWT shared secret section (when selecting the JWT security handler), we add the shared secret (a public key is recommended).

First, let’s set things up:

  1. In the API Designer, we select “JSON Web Token” as the authentication mode
  2. Select RSA as the signing method
  3. Add a secret (public key)
  4. Set the identity source to be “sub” – this tells Tyk which claim in the JWT to use as the base “identity” of the JWT, i.e. the bearer (this might be a username, an email address, or even a user ID); a common JWT claim is “email”, and we could use that too
  5. Set the policy field name to be “policy” – this tells Tyk which claim in the JWT to use to identify the policy to apply to this identity, as in: the access control, quota and rate limiting settings to apply to this identity
  6. Save this thing, it will now go live onto your gateway

Now let’s create an actual policy:

  1. In our Policies section, we create a new policy called “JWT Test”
  2. Set the quota and rate limit rules and, most importantly, grant access to the API we just created

Let’s say when we save this policy, the ID returned is 1234567

So, let’s walk through a user flow:

  1. A user goes to a third-party login portal and authenticates themselves
  2. This third-party system generates a JWT using its private key (a sketch of this follows the list), and in this JWT adds the following claims:
    • policy: 1234567
    • sub: the user’s email address
  3. The user is then granted this JWT (maybe as a fragment in the URL) and uses it to access the API we created earlier
  4. Access is magically granted!
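
To make that concrete, here is a minimal sketch of what the third-party system’s token generation might look like with the pyjwt library. The key path, email address and expiry are placeholders, and in practice your IDP handles this step for you.

    import time
    import jwt  # pip install pyjwt[crypto]

    # Placeholder: the IDP's RSA private key (Tyk holds the matching public key).
    with open("idp_private_key.pem") as key_file:
        private_key = key_file.read()

    claims = {
        "sub": "user@example.com",       # the identity source field configured in Tyk
        "policy": "1234567",             # the policy field name configured in Tyk
        "iat": int(time.time()),
        "exp": int(time.time()) + 3600,  # expires in one hour
    }

    token = jwt.encode(claims, private_key, algorithm="RS256")
    print(token)  # the user sends this in the Authorization header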

Tyk will now validate this request as follows:

  1. It extracts the JWT from the request and validates its signature against the shared secret (the public key) that we added at the API level in the setup steps above
  2. If the token is valid, it looks for the identity source field; since this is configured as “sub”, it finds the user’s email address
  3. It uses the email address to generate a hash of the address, and then creates a new value of {org-id}{hash(sub)} – basically, it generates an internal token that is bound to the identity (it will be regenerated each time this sub shows up; a short illustrative sketch follows this list)
  4. Tyk extracts the policy ID from the claims and retrieves this from memory (if it exists)
  5. Tyk now tries to find this internal token hash in its key store – if it exists, it applies the policy to the key (this does not override existing values, it just sets the maximums so that they have an immediate effect if changed); access control rules are overridden too, so they can be changed depending on the access source or the IDP doing the logging in
  6. If the internal token does not exist, Tyk creates the token based on the policy data – same as earlier but with a new identity-based token
  7. When the token is created, the internal token ID is added to the metadata of the session as a new field: TykJWTSessionID. (This is important, because you can now reference this metadata object in the transforms middleware – for example, you could inject it as a custom header into the outbound request, so your upstream application has access to the original JWT and the internal Tyk session token in case it needs to invalidate or track a specific user’s behaviour. Aren’t we gosh darn helpful?) Since these session IDs do not change, they persist across JWTs for this identity
  8. This internal token is now used to represent the request throughout the Tyk request chain – so now the access control, rate limiting and quota middleware will act on this token as if it were a real key – but it leaves the actual JWT intact in the request so it can be processed upstream
  9. If the access rules set by the policy referenced in the JWT’s policy field are all valid, the request is allowed to pass through upstream
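
The internal token derivation in step 3 is conceptually similar to the illustrative sketch below. The hash function and org ID are placeholders; Tyk’s actual implementation details may differ.

    import hashlib

    def internal_token(org_id: str, sub: str) -> str:
        # Illustrative only: combine the org ID with a hash of the identity claim,
        # so the same "sub" always maps to the same internal token.
        return org_id + hashlib.sha256(sub.encode()).hexdigest()

    print(internal_token("<org-id>", "user@example.com"))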

Phew! That’s a lot of steps, but it means that you can handle access control in your IDP with a single added claim.

Anyway, we thought it was interesting – we hope you did too!

Simpler usage tracking with Token Aliases in Tyk Cloud!

So you might have noticed that this week we needed to do a little tinkering with our servers – thanks for your patience while we sorted all that out. We’re growing so quickly that we’re constantly pushing new features. One that went out today that we’re particularly happy with is Token Aliases.

What the heck is a Token Alias?

As you might know, Tyk hashes all keys when they are created so that they are obfuscated should your DB be breached. This creates a unique problem – how do you identify the tokens in your logs? That’s what Aliases aim to solve.

An Alias can be set on any token, and when the token accesses your APIs, the alias will be stored alongside the token ID in your analytics and displayed in your dashboard.

What’s more, when a developer generates a token via your API Developer portal, Tyk will auto-assign their email address as the alias to their token so that you can track their activity more easily.

(P.S. If you’re an on-prem user, this feature will be available in the nightlies tomorrow!)

Really simple, and really useful – enjoy Tykers!
