CloudHub is MuleSoft’s integration platform as a service (iPaaS) that
enables the deployment and management of integration solutions in the
cloud. Runtime Manager, CloudHub’s management tool, provides an
integrated set of logging tools that allow support and operations staff
to monitor and troubleshoot application logs of deployed applications.
Currently, application log entries are kept for 30 days or until they reach a maximum size of 100 MB. We are often required to keep these logs for longer periods for auditing or archiving purposes, and overly chatty applications (those that write log entries frequently) may find their logs cover only a few days, restricting the troubleshooting window even further. Runtime Manager allows portal users to download log files manually via the browser; however, no automated solution is provided out of the box.
The good news is that the platform provides both a command-line tool and a management API we can leverage. Leaving the CLI to one side for now, the management API looks promising, and a search in Anypoint Exchange even yields a ready-built CloudHub Connector. Upon further investigation, however, the connector doesn't meet all our requirements: it does not appear to support different business groups and environments, so using it to download logs for applications deployed to non-default environments will not work (at least in the current version). The best approach is to consume the management APIs provided by the Anypoint Platform directly. RAML definitions are available for these APIs, which makes consuming them within a Mule flow straightforward.
Solution overview
In this post we’ll develop a CloudHub application that is triggered periodically to loop through a collection of target applications, connect to the Anypoint Management APIs, and fetch the current application log for each deployed instance. The downloaded logs are compressed and sent to an Amazon S3 bucket for archiving.

Putting the solution together:
We start by grabbing the RAML definitions for both the Anypoint Access Management API and the Anypoint Runtime Manager API and bringing them into the project. The Access Management API provides the authentication and authorisation operations to log in and obtain the access token needed in subsequent calls to the Runtime Manager API. The Runtime Manager API provides the operations to enumerate the deployed instances of an application and to download the application log.
Download the RAML definitions and add them to the project by extracting them into the src/main/api folder.

To consume these APIs we’ll use the HTTP connector, so we need to define some global configuration elements that make use of the RAML definitions we just imported.

Note: Referencing these definitions directly from Exchange currently throws RAML parsing errors, so we download them manually and reference our local copies instead. We’ll need to update these local copies as the API definitions change in the future.
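For reference, the two global HTTP request configurations point at the locally stored RAML files (they also appear in the full configuration at the end of the post):

<http:request-config name="Access_Management_Config" protocol="HTTPS" host="anypoint.mulesoft.com" port="443" basePath="/accounts" doc:name="HTTP Request Configuration">
    <http:raml-api-configuration location="access_management/api.raml"/>
</http:request-config>

<http:request-config name="CloudHub_Config" protocol="HTTPS" host="anypoint.mulesoft.com" port="443" basePath="/cloudhub/api" doc:name="HTTP Request Configuration">
    <http:raml-api-configuration location="cloudhub/api.raml"/>
</http:request-config>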
To provide simple multi-value configuration support, I use a JSON structure that describes the collection of applications we need to iterate over.
{
    "config": [{
        "anypointApplication": "myDeployedApp-1",
        "anypointEnvironmentId": "<environment id gathered from Anypoint CLI>",
        "amazonS3Bucket": "<S3 bucket name>"
    },
    {
        "anypointApplication": "myDeployedApp-2",
        "anypointEnvironmentId": "<environment id gathered from Anypoint CLI>",
        "amazonS3Bucket": "<S3 bucket name>"
    }]
}
Our flow then reads in this configuration and transforms it into a HashMap that we can iterate over.
Note: Environment IDs can be gathered using the Runtime Manager API or the Anypoint CLI; a sketch of the API approach follows.
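If you prefer to stay within the platform APIs, a minimal sketch of such a lookup is below. It assumes the Access Management RAML exposes the /api/organizations/{orgId}/environments resource, reuses the access_token flow variable set by the cloudhubLogin sub-flow shown later, and introduces a hypothetical anypoint.organization.id property:

<sub-flow name="listEnvironments">
    <!-- Sketch only: lists the environments for an organisation so their IDs
         can be copied into the JSON configuration above. -->
    <http:request config-ref="Access_Management_Config" path="/api/organizations/{orgId}/environments" method="GET" doc:name="HTTP">
        <http:request-builder>
            <http:uri-param paramName="orgId" value="${anypoint.organization.id}"/>
            <http:header headerName="Authorization" value="#['Bearer ' + flowVars.access_token]"/>
        </http:request-builder>
    </http:request>
    <logger message="#[payload]" level="INFO" doc:name="Log environments"/>
</sub-flow>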

Next, we create our top-level flow, triggered periodically, which reads and parses our configuration setting into a collection we can iterate over to download the application logs.

<flow name="logArchiverFlow"> |
<poll doc:name="Poll"> |
<fixed-frequency-scheduler frequency="${polling.frequency.hours}" timeUnit="HOURS"/> |
<set-payload value="#['${log.achiver.config}']" mimeType="application/json" doc:name="Read config"/> |
</poll> |
<json:json-to-object-transformer returnClass="java.util.HashMap" doc:name="JSON to Object"/> |
<set-variable variableName="configCollection" value="#[payload.config]" doc:name="Set configCollection flowVar"/> |
<foreach collection="#[flowVars.configCollection]" counterVariableName="configCounter" doc:name="For Each item in Config"> |
<set-variable variableName="config" value="#[flowVars.configCollection[configCounter-1]]" doc:name="Set config flowVar"/> |
<logger message="#['Archiving log files for CloudHub application: "' + flowVars.config.anypointApplication + '" to Amazon S3 bucket: "' + flowVars.config.amazonS3Bucket + '"...']" level="INFO" doc:name="Logger"/> |
<flow-ref name="archiveLogFile" doc:name="archiveLogFile"/> |
</foreach> |
<catch-exception-strategy doc:name="Catch Exception Strategy"> |
<logger level="ERROR" doc:name="Logger"/> |
</catch-exception-strategy> |
</flow> |

<sub-flow name="archiveLogFile"> |
<flow-ref name="cloudhubLogin" doc:name="cloudhubLogin"/> |
<flow-ref name="cloudhubDeployments" doc:name="cloudhubDeployments"/> |
<foreach collection="#[flowVars.instances]" counterVariableName="instanceCounter" doc:name="For Each deployed instance"> |
<set-variable variableName="instanceId" value="#[flowVars.instances[flowVars.instanceCounter-1].instanceId]" doc:name="Set InstanceId flowVar"/> |
<flow-ref name="cloudhubLogFiles" doc:name="cloudhubLogFiles"/> |
</foreach> |
</sub-flow> |

<sub-flow name="cloudhubLogin"> |
<set-payload value="#['{ "username": "${anypoint.login.username}", "password": "${anypoint.login.password}"}']" mimeType="application/json" doc:name="Set Payload"/> |
<http:request config-ref="Access_Management_Config" path="/login" method="POST" doc:name="HTTP"/> |
<json:json-to-object-transformer doc:name="JSON to Object" returnClass="java.util.HashMap"/> |
<set-variable variableName="access_token" value="#[payload.access_token]" doc:name="Set Access_Token FlowVar"/> |
<logger level="DEBUG" doc:name="Logger"/> |
</sub-flow> |

<sub-flow name="cloudhubDeployments"> |
<set-payload value="{}" mimeType="application/json" doc:name="Set Payload"/> |
<http:request config-ref="CloudHub_Config" path="/v2/applications/{domain}/deployments" method="GET" doc:name="HTTP"> |
<http:request-builder> |
<http:uri-param paramName="domain" value="#[flowVars.config.anypointApplication]"/> |
<http:header headerName="X-ANYPNT-ENV-ID" value="#[flowVars.config.anypointEnvironmentId]"/> |
<http:header headerName="Authorization" value="#['Bearer ' + flowVars.access_token]"/> |
</http:request-builder> |
</http:request> |
<json:json-to-object-transformer doc:name="JSON to Object" returnClass="java.util.HashMap"/> |
<set-variable variableName="instances" value="#[payload.data[0].instances]" doc:name="Set Instances FlowVar"/> |
<logger level="DEBUG" doc:name="Logger"/> |
</sub-flow> |
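Based on the fields this sub-flow reads (payload.data[0].instances and each instance's instanceId), the deployments response has roughly the following shape (illustrative only, trimmed to the fields we use):

{
    "data": [{
        "instances": [
            { "instanceId": "<instance id>" }
        ]
    }]
}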

<sub-flow name="cloudhubLogFiles"> |
<set-payload value="{}" mimeType="application/json" doc:name="Set Payload"/> |
<http:request config-ref="CloudHub_Config" path="/v2/applications/{domain}/instances/{instanceId}/log-file" method="GET" doc:name="HTTP"> |
<http:request-builder> |
<http:uri-param paramName="domain" value="#[flowVars.config.anypointApplication]"/> |
<http:uri-param paramName="instanceId" value="#[flowVars.instanceId]"/> |
<http:header headerName="X-ANYPNT-ENV-ID" value="#[flowVars.config.anypointEnvironmentId]"/> |
<http:header headerName="Authorization" value="#['Bearer ' + flowVars.access_token]"/> |
</http:request-builder> |
</http:request> |
<transformer ref="customZipTransformer" doc:name="ZIP before sending"/> |
<s3:create-object config-ref="Amazon_S3_Configuration" bucketName="#[flowVars.config.amazonS3Bucket]" key="#[flowVars.config.anypointApplication + '-' + flowVars.instanceId + '-' + server.dateTime + '.zip']" doc:name="Amazon S3"/> |
</sub-flow> |
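The customZipTransformer referenced above (and declared in the full configuration below) is a small custom Java transformer. A minimal sketch of such a transformer, assuming a Mule 3 AbstractTransformer and a payload that arrives as a stream, byte array, or string, might look like the following; the actual class in the project may differ:

package kloud.cloudhub.logarchiver.transformers;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

import org.mule.api.transformer.TransformerException;
import org.mule.config.i18n.MessageFactory;
import org.mule.transformer.AbstractTransformer;
import org.mule.util.IOUtils;

/**
 * Wraps the message payload (the downloaded application log) in a
 * single-entry zip archive before it is sent to Amazon S3.
 */
public class ZipTransformer extends AbstractTransformer {

    @Override
    protected Object doTransform(Object src, String enc) throws TransformerException {
        try {
            // Normalise the payload to a byte array; the HTTP connector
            // typically hands us a stream.
            byte[] payload;
            if (src instanceof byte[]) {
                payload = (byte[]) src;
            } else if (src instanceof InputStream) {
                payload = IOUtils.toByteArray((InputStream) src);
            } else {
                payload = src.toString().getBytes(enc);
            }

            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            ZipOutputStream zos = new ZipOutputStream(baos);
            // The entry name is arbitrary; "application.log" is an assumption.
            zos.putNextEntry(new ZipEntry("application.log"));
            zos.write(payload);
            zos.closeEntry();
            zos.close();
            return baos.toByteArray();
        } catch (IOException e) {
            throw new TransformerException(
                    MessageFactory.createStaticMessage("Failed to zip log file"), this, e);
        }
    }
}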
The full configuration for the workflow is shown below.
<?xml version="1.0" encoding="UTF-8"?> <mule xmlns:batch="http://www.mulesoft.org/schema/mule/batch" xmlns:s3="http://www.mulesoft.org/schema/mule/s3" xmlns:json="http://www.mulesoft.org/schema/mule/json" xmlns:http="http://www.mulesoft.org/schema/mule/http" xmlns:cloudhub="http://www.mulesoft.org/schema/mule/cloudhub" xmlns:tracking="http://www.mulesoft.org/schema/mule/ee/tracking" xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation" xmlns:spring="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd http://www.mulesoft.org/schema/mule/ee/tracking http://www.mulesoft.org/schema/mule/ee/tracking/current/mule-tracking-ee.xsd http://www.mulesoft.org/schema/mule/cloudhub http://www.mulesoft.org/schema/mule/cloudhub/current/mule-cloudhub.xsd http://www.mulesoft.org/schema/mule/json http://www.mulesoft.org/schema/mule/json/current/mule-json.xsd http://www.mulesoft.org/schema/mule/s3 http://www.mulesoft.org/schema/mule/s3/current/mule-s3.xsd http://www.mulesoft.org/schema/mule/batch http://www.mulesoft.org/schema/mule/batch/current/mule-batch.xsd"> <custom-transformer name="customZipTransformer" class="kloud.cloudhub.logarchiver.transformers.ZipTransformer" doc:name="Java"/> <http:request-config name="Access_Management_Config" protocol="HTTPS" host="anypoint.mulesoft.com" port="443" basePath="/accounts" doc:name="HTTP Request Configuration"> <http:raml-api-configuration location="access_management/api.raml"/> </http:request-config> <http:request-config name="CloudHub_Config" protocol="HTTPS" host="anypoint.mulesoft.com" port="443" basePath="/cloudhub/api" doc:name="HTTP Request Configuration"> <http:raml-api-configuration location="cloudhub/api.raml"/> </http:request-config> <s3:config name="Amazon_S3_Configuration" accessKey="${amazon.s3.access_key}" secretKey="${amazon.s3.secret_key}" doc:name="Amazon S3 Configuration"/> <flow name="logArchiverFlow"> <poll doc:name="Poll"> <fixed-frequency-scheduler frequency="${polling.frequency.hours}" timeUnit="HOURS"/> <set-payload value="#['${log.achiver.config}']" mimeType="application/json" doc:name="Read config"/> </poll> <json:json-to-object-transformer returnClass="java.util.HashMap" doc:name="JSON to Object"/> <set-variable variableName="configCollection" value="#[payload.config]" doc:name="Set configCollection flowVar"/> <foreach collection="#[flowVars.configCollection]" counterVariableName="configCounter" doc:name="For Each item in Config"> <set-variable variableName="config" value="#[flowVars.configCollection[configCounter-1]]" doc:name="Set config flowVar"/> <logger message="#['Archiving log files for CloudHub application: "' + flowVars.config.anypointApplication + '" to Amazon S3 bucket: "' + flowVars.config.amazonS3Bucket + '"...']" level="INFO" doc:name="Logger"/> <flow-ref name="archiveLogFile" doc:name="archiveLogFile"/> </foreach> <catch-exception-strategy doc:name="Catch Exception Strategy"> <logger level="ERROR" doc:name="Logger"/> </catch-exception-strategy> </flow> <sub-flow name="archiveLogFile"> <flow-ref name="cloudhubLogin" doc:name="cloudhubLogin"/> <flow-ref name="cloudhubDeployments" 
doc:name="cloudhubDeployments"/> <foreach collection="#[flowVars.instances]" counterVariableName="instanceCounter" doc:name="For Each deployed instance"> <set-variable variableName="instanceId" value="#[flowVars.instances[flowVars.instanceCounter-1].instanceId]" doc:name="Set InstanceId flowVar"/> <flow-ref name="cloudhubLogFiles" doc:name="cloudhubLogFiles"/> </foreach> </sub-flow> <sub-flow name="cloudhubLogin"> <set-payload value="#['{ "username": "${anypoint.login.username}", "password": "${anypoint.login.password}"}']" mimeType="application/json" doc:name="Set Payload"/> <http:request config-ref="Access_Management_Config" path="/login" method="POST" doc:name="HTTP"/> <json:json-to-object-transformer doc:name="JSON to Object" returnClass="java.util.HashMap"/> <set-variable variableName="access_token" value="#[payload.access_token]" doc:name="Set Access_Token FlowVar"/> <logger level="DEBUG" doc:name="Logger"/> </sub-flow> <sub-flow name="cloudhubDeployments"> <set-payload value="{}" mimeType="application/json" doc:name="Set Payload"/> <http:request config-ref="CloudHub_Config" path="/v2/applications/{domain}/deployments" method="GET" doc:name="HTTP"> <http:request-builder> <http:uri-param paramName="domain" value="#[flowVars.config.anypointApplication]"/> <http:header headerName="X-ANYPNT-ENV-ID" value="#[flowVars.config.anypointEnvironmentId]"/> <http:header headerName="Authorization" value="#['Bearer ' + flowVars.access_token]"/> </http:request-builder> </http:request> <json:json-to-object-transformer doc:name="JSON to Object" returnClass="java.util.HashMap"/> <set-variable variableName="instances" value="#[payload.data[0].instances]" doc:name="Set Instances FlowVar"/> <logger level="DEBUG" doc:name="Logger"/> </sub-flow> <sub-flow name="cloudhubLogFiles"> <set-payload value="{}" mimeType="application/json" doc:name="Set Payload"/> <http:request config-ref="CloudHub_Config" path="/v2/applications/{domain}/instances/{instanceId}/log-file" method="GET" doc:name="HTTP"> <http:request-builder> <http:uri-param paramName="domain" value="#[flowVars.config.anypointApplication]"/> <http:uri-param paramName="instanceId" value="#[flowVars.instanceId]"/> <http:header headerName="X-ANYPNT-ENV-ID" value="#[flowVars.config.anypointEnvironmentId]"/> <http:header headerName="Authorization" value="#['Bearer ' + flowVars.access_token]"/> </http:request-builder> </http:request> <transformer ref="customZipTransformer" doc:name="ZIP before sending"/> <s3:create-object config-ref="Amazon_S3_Configuration" bucketName="#[flowVars.config.amazonS3Bucket]" key="#[flowVars.config.anypointApplication + '-' + flowVars.instanceId + '-' + server.dateTime + '.zip']" doc:name="Amazon S3"/> </sub-flow> </mule>Once packaged and deployed to CloudHub we configure the solution to archive application logs for any deployed CloudHub app, even if they have been deployed into environments other than the one hosting the log archiver solution.
After running the solution for a day or so and checking the configured storage location, we can confirm logs are being archived each day.
Known limitations:
- The Anypoint Management API does not allow downloading application logs for a given date range; each time the solution runs, a full copy of the application log is downloaded. The API does support an operation to query the logs for a given date range and return matching entries as a result set, but that comes with additional constraints on result-set size (number of rows) and entry size (message truncation).
- The RAML definitions in Anypoint Exchange currently do not parse correctly in Anypoint Studio. As mentioned above, we work around this by downloading the RAML manually and bringing it into the project ourselves.
- Credentials supplied in configuration are stored in plain text. We suggest creating a dedicated Anypoint account and granting it permissions only on the target environments; a possible further mitigation is sketched below.
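One way to reduce the exposure of plain-text credentials is Mule's secure property placeholder, which lets the properties file hold encrypted values with the decryption key supplied at deployment time. This assumes the Anypoint Enterprise Security module (which provides the secure-property-placeholder namespace) is added to the project; a sketch:

<!-- Sketch only: requires the Anypoint Enterprise Security module. Values in
     app.properties are stored encrypted as ![...] entries, and the runtime.key
     property carrying the decryption key is supplied at deployment time. -->
<secure-property-placeholder:config name="secureConfig" key="${runtime.key}" location="app.properties"/>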