By Peter Reeves and Aiden Gallagher
Introduction
When developing, administering, and managing an API Connect topology, it is necessary to have a strategy for the use of environment-specific configuration.
API Connect has strong user journeys and capabilities for managing keys and certificates through TLS Profile configuration objects, and for managing non-confidential environment-specific strings and settings through the Catalog Properties feature or the API Properties feature (for API-scoped configuration).
However, there is often a need for protected data to be added to the API definition, such as usernames and passwords, basic-auth credentials, API keys, or other confidential strings. It is important to protect these from general access.
This article describes the available methods for storing and using confidential strings (usernames, passwords, API keys, and so on) within API assemblies, and reviews the benefits and constraints of each method.
Table of Contents
Introduction
Accessing Credentials in APIs
A common mistake: Leaking Credentials through the use of API Drafts
A common mistake: Misaligning view permissions in API Manager
A common mistake: Enabling CLI all access (even in pipelines)
Options for removing passwords from the API YAML
Using Catalog Properties
Injecting Passwords into Automation Pipelines
API Connect Custom Policies (User-defined or Global Policies)
Referencing DataPower Objects
Conclusion
Accessing Credentials in APIs
Early in the API development process (and often for convenience while hacking a working prototype API together) it can be tempting to inline or hardcode passwords into the API YAML. However, this means that any user with access to the API YAML can see the credentials in clear text, posing a security risk to the downstream services they protect.
These API YAMLs can be accessed in three areas:
- In the API YAML’s source location, e.g., GitHub, GitLab, Azure DevOps, SVN, etc.
- In the API Drafts view, which is accessible in the API Manager Web User Interface
- Directly from the API Manager using the API Toolkit Command Line Interface.
APIs themselves can also reference other objects to remove the need for passwords and other credentials to be inlined or hardcoded in the YAML. This includes:
- API Manager Properties i.e., Catalog Properties
- IBM DataPower objects, which are already accessible from the API Assembly
- External services from which credentials can be pulled directly, for example a password vault such as HashiCorp Vault or CyberArk; these typically have APIs that can be called to extract passwords either at API deploy time or during the API runtime
Note: side calls to password vaults from inside API assemblies at runtime are cumbersome and likely to impact response times.
There are also different types of credentials, which may have different access permissions. For example, API Connect might use one common set of credentials to connect to IBM ACE, whereas other services might expect a specific user depending on the API.
In addition to making permissions more granular through the use of varied credentials, the users who should be able to read and write these passwords may also differ: perhaps an API Connect infrastructure team, the API owners and developers themselves, or possibly even no one.
A common mistake: Leaking Credentials through the use of API Drafts
Confidential credentials shouldn’t be stored in source locations, or directly in API YAMLs, that can be accessed by general users. It is common practice to use API Drafts to develop APIs in the API Manager UI, and it is also regularly the case that the API Manager UI is used to deploy APIs in all environments.
The use of API Drafts still means APIs containing plain-text passwords are stored somewhere. This is not recommended outside of development, and even in development Drafts should not be a long-term store of APIs; they should be deleted after successful deployment and testing.
A common mistake: Misaligning view permissions in API Manager
Often, locking down access to the API Manager UI is relied upon as the only way to control who can see certain objects, but administration and operations users must have access to credentials at some point, either to deploy the API or to view it when it is managed within the API Manager.

Figure 1 - Permissions for objects in API Manager
For each permission and category, it is important to understand what ‘view’ allows you to access. For example, ‘Settings’ gives the ability to view Catalog and Space Properties as well as user-defined properties within the gateway services, and the API Drafts ‘view’ permission allows all Drafts in the Provider Org to be viewed.
Whilst it is possible to use the out-of-the-box roles to restrict this access, it is also possible to create custom roles, which allow you to manage access more specifically.
A common mistake: Enabling CLI all access (even in pipelines)
The Developer Toolkit (sometimes known as the APIC Command Line Interface) depends on User Registry identities. User permissions are scoped to a realm, such as the cloud administration organization, a provider organization, or a consumer organization.
The CLI therefore has the ability to make changes to the administrative state of the API Connect Cloud, and to see and change client IDs, client secrets, and even subscriptions. Restricting this access is very important, whether through the user roles assigned or by restricting the commands that can be run.
The CLI is often used within automation pipelines, which could instead use operating-system permission controls to restrict which commands can be run. Alternatively, fixed scripts and pipelines that require approvals to update and run could be implemented to limit the use of powerful commands. Linux/shell restrictions are not part of the API Connect product, so implementing them is a Linux administrator activity.
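For example, a minimal sketch (assuming GitHub Actions syntax and an illustrative script name) of an approval-gated job that only ever runs a fixed, checked-in publish script rather than arbitrary toolkit commands:

```yaml
# Hypothetical pipeline: deployments to the "production" environment can be configured
# (in the repository settings) to require reviewer approval, and the runner only
# executes the fixed script committed to the repository.
name: publish-api-product
on: workflow_dispatch                  # triggered manually; a tag or release trigger would also work
jobs:
  publish:
    runs-on: ubuntu-latest
    environment: production            # reviewer approval can be enforced on this environment
    steps:
      - uses: actions/checkout@v4
      - name: Publish product with the toolkit CLI
        run: ./scripts/publish-product.sh     # fixed, reviewed script; no ad-hoc apic commands
        env:
          APIC_USERNAME: ${{ secrets.APIC_USERNAME }}
          APIC_PASSWORD: ${{ secrets.APIC_PASSWORD }}
```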
Options for removing passwords from the API YAML
Using Catalog Properties
It is possible to set variables as Catalog Properties, which could be used for storing credentials. These properties can then be referenced in the API, which also allows a value to be updated without redeploying the API, as the property can be changed under the covers.
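For example, a minimal sketch of how a Catalog Property might be referenced from an assembly policy (the property name, backend URL, and username are illustrative, and exact invoke policy fields can vary by gateway type and version):

```yaml
# Illustrative fragment of an API definition: the credential is resolved from the
# Catalog Property at runtime via the $(name) syntax rather than being stored in the YAML.
x-ibm-configuration:
  assembly:
    execute:
      - invoke:
          version: 2.0.0
          title: call-backend
          target-url: 'https://backend.example.com/orders'
          username: 'api-connect-svc'           # non-confidential, so safe to inline
          password: '$(backend-password)'       # resolved from the Catalog Property at runtime
```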
One concern with Catalog Properties is that they are visible in the Catalog. There is the option to base64-encode these values, but base64 is not encryption, so this still relies on Catalog access permissions.

Figure 2 - Catalog Properties in API Connect
Encrypting the properties
It is possible to encrypt the properties before uploading them to the Catalog. An operations team member can encrypt the property string and then add the ciphertext to the Catalog. This relies on the decryption key being made available to the API (usually by adding it directly to DataPower and then referencing it inside a GatewayScript policy). Storing encrypted properties keeps the credentials hidden from anyone with view permissions on the Catalog; however, every API that uses these properties must decrypt them inside the API Assembly at runtime, which could impact performance and response times.
This requires both the encryption and decryption routines to be created and maintained, as well as the keys or certificates to be stored securely in DataPower.
This option is for more advanced API Connect developers who are comfortable working with DataPower objects. The key benefit is that nobody can view the properties within API Connect without using the decryption function.
Injecting Passwords into Automation Pipelines

API developers could make use of automation pipelines when deploying APIs and Products. This relies on having an external password store, such as CyberArk, HashiCorp Vault, or Azure Key Vault, which the pipeline can read from at deploy time.
The API YAML is stored in the source repository with parameterised credentials, which tell the pipeline which credentials to pull and from where. During deployment, the password is injected into the API YAML (ephemerally) before it is pushed into API Connect.
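A minimal sketch of this pattern, assuming a GitHub Actions style pipeline step, GNU envsubst for the substitution, and illustrative file, organisation, and secret names (toolkit CLI flags may vary slightly by version):

```yaml
# The API YAML committed to source control carries only a placeholder, for example:
#   password: '${BACKEND_PASSWORD}'
# The step below resolves that placeholder from the pipeline's secret store at deploy
# time and publishes the product with the toolkit CLI; the real value never reaches
# the repository or the Drafts view.
steps:
  - name: Inject credential and publish
    env:
      BACKEND_PASSWORD: ${{ secrets.BACKEND_PASSWORD }}   # sourced from the vault / secret store
      APIC_PASSWORD: ${{ secrets.APIC_PASSWORD }}
      APIC_USER: ${{ vars.APIC_USER }}
      APIC_SERVER: ${{ vars.APIC_SERVER }}
    run: |
      # Substitute only the named placeholder so $(catalog-property) references are untouched
      envsubst '${BACKEND_PASSWORD}' < order-api_1.0.0.yaml > tmp.yaml && mv tmp.yaml order-api_1.0.0.yaml
      apic login --server "$APIC_SERVER" --username "$APIC_USER" --password "$APIC_PASSWORD" --realm provider/default-idp-2   # realm name is environment-specific
      apic products:publish order-product_1.0.0.yaml --server "$APIC_SERVER" --org my-org --catalog production
```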
The key benefit of using the pipeline to publish Products and their APIs containing the credentials is that the API YAML cannot be viewed from the API Manager (when pushed using the API Manager CLI/API, the API YAMLs “skip” the Drafts area). Injecting credentials and other environment-specific configuration into deployable artefacts via pipelines is an extremely common and well-known pattern, and most password stores have REST APIs.
Some considerations are that the API needs to be redeployed to update the credentials, and that the API YAML can still be accessed using the CLI, so users with that access are able to see the credentials in clear text.
API Connect Custom Policies (User-defined or Global Policies)
Custom Policies, whether User-defined policies or Global Policies, are policies that can be dragged and dropped (or referenced) into API Assemblies. They are easily reusable pieces of API Connect processing that can be used to inject credentials into APIs without storing them in clear text in the API YAML.
Instead, the credentials are stored in the custom policy objects themselves, and they are therefore not accessible when viewing any object in the API Manager or the CLI.
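As a rough sketch of the shape this can take, assuming a v5-compatible style user-defined policy YAML (the exact packaging and schema differ between gateway types and API Connect versions, and all names and values here are illustrative):

```yaml
# Hypothetical user-defined policy: the credential lives only in this policy package,
# and APIs simply drag the policy into their assembly to have the variable populated.
policy: 1.0.0
info:
  title: Inject backend credentials
  name: inject-backend-credentials
  version: 1.0.0
gateways:
  - datapower-gateway
assembly:
  execute:
    - set-variable:
        title: set-backend-credential
        actions:
          - set: backend-password          # runtime variable consumed by the calling API
            value: 'example-secret-value'  # real value held only in this policy, not in the API YAML
```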
The concern with this approach is that it requires the policy, and therefore the credentials, to be stored somewhere such as a source repository. Whilst this method does not require an API redeployment, it does still require updates to the policy itself, which can be an invasive activity potentially impacting multiple APIs at once.
This method does have benefits if a single team, such as an API Connect infrastructure team, is managing credentials, because it allows that team to control the credentials without developers being able to view them.
https://www.ibm.com/docs/en/api-connect/10.0.x?topic=applications-working-global-policies
https://www.ibm.com/docs/en/api-connect/10.0.x?topic=constructs-user-defined-policies
Referencing DataPower Objects

Developers or API Connect administrators can store credentials in DataPower, which can then be referenced within APIs using out-of-the-box functionality. One example might be the use of crypto objects to store certificates, or reading credentials from a local DataPower file.
Whilst this hides the credentials from view in the API Manager UI or CLI, it still requires the credentials to be stored somewhere, in this case in a secure gateway, which in some ways poses more risk than exposing them in the API Manager.
There needs to be a method to push and manage these objects, and in some cases to provide access to developers for debugging issues. It also requires DataPower knowledge and skills to implement.
Conclusion
Whilst all of these options are workable, and their benefits and constraints have been detailed, it is important to understand that the decisions made will have some impact on existing systems and processes.
Therefore, you should analyse the options and their impact on your organisation. Consider:
- What skills you have in-house, e.g., having to learn DataPower custom code or log in to the box
- How responsibilities and teams are structured and allocated in your organisation: how will the option you are analysing impact the various teams involved, and does it require giving additional permissions or roles to teams to fulfil a single use case?
- Avoiding additional complexity, e.g., creating a pipeline for a single API use case
- The requirement for additional components and their costs, e.g., a password store external to API Connect
- The security implications and the security rating of the APIs, for example, whether you want to expose access to DataPower for storing credentials