{"files":{"SKILL.md":"---\nname: aws-iot-analytics\ndescription: \"AWS IoT Analytics API skill. Use when working with AWS IoT Analytics for messages, pipelines, channels. Covers 34 endpoints.\"\nversion: 1.0.0\ngenerator: lapsh\n---\n\n# AWS IoT Analytics\nAPI version: 2017-11-27\n\n## Auth\nAWS SigV4\n\n## Base URL\nNot specified.\n\n## Setup\n1. Configure auth: AWS SigV4\n2. GET /logging -- retrieves the current settings of the iot analytics logging options.\n3. POST /messages/batch -- create first batch\n\n## Endpoints\n34 endpoints across 8 groups. See references/api-spec.lap for full details.\n\n### Messages\n| Method | Path | Description |\n|--------|------|-------------|\n| POST | /messages/batch | Sends messages to a channel. |\n\n### Pipelines\n| Method | Path | Description |\n|--------|------|-------------|\n| DELETE | /pipelines/{pipelineName}/reprocessing/{reprocessingId} | Cancels the reprocessing of data through the pipeline. |\n| POST | /pipelines | Creates a pipeline. A pipeline consumes messages from a channel and allows you to process the messages before storing them in a data store. You must specify both a channel and a datastore activity and, optionally, as many as 23 additional activities in the pipelineActivities array. |\n| DELETE | /pipelines/{pipelineName} | Deletes the specified pipeline. |\n| GET | /pipelines/{pipelineName} | Retrieves information about a pipeline. |\n| GET | /pipelines | Retrieves a list of pipelines. |\n| POST | /pipelines/{pipelineName}/reprocessing | Starts the reprocessing of raw message data through the pipeline. |\n| PUT | /pipelines/{pipelineName} | Updates the settings of a pipeline. You must specify both a channel and a datastore activity and, optionally, as many as 23 additional activities in the pipelineActivities array. |\n\n### Channels\n| Method | Path | Description |\n|--------|------|-------------|\n| POST | /channels | Used to create a channel. 
A channel collects data from an MQTT topic and archives the raw, unprocessed messages before publishing the data to a pipeline. |\n| DELETE | /channels/{channelName} | Deletes the specified channel. |\n| GET | /channels/{channelName} | Retrieves information about a channel. |\n| GET | /channels | Retrieves a list of channels. |\n| GET | /channels/{channelName}/sample | Retrieves a sample of messages from the specified channel ingested during the specified timeframe. Up to 10 messages can be retrieved. |\n| PUT | /channels/{channelName} | Used to update the settings of a channel. |\n\n### Datasets\n| Method | Path | Description |\n|--------|------|-------------|\n| POST | /datasets | Used to create a dataset. A dataset stores data retrieved from a data store by applying a queryAction (a SQL query) or a containerAction (executing a containerized application). This operation creates the skeleton of a dataset. The dataset can be populated manually by calling CreateDatasetContent or automatically according to a trigger you specify. |\n| POST | /datasets/{datasetName}/content | Creates the content of a dataset by applying a queryAction (a SQL query) or a containerAction (executing a containerized application). |\n| DELETE | /datasets/{datasetName} | Deletes the specified dataset. You do not have to delete the content of the dataset before you perform this operation. |\n| DELETE | /datasets/{datasetName}/content | Deletes the content of the specified dataset. |\n| GET | /datasets/{datasetName} | Retrieves information about a dataset. |\n| GET | /datasets/{datasetName}/content | Retrieves the contents of a dataset as presigned URIs. |\n| GET | /datasets/{datasetName}/contents | Lists information about dataset contents that have been created. |\n| GET | /datasets | Retrieves information about datasets. |\n| PUT | /datasets/{datasetName} | Updates the settings of a dataset. 
|\n\n### Datastores\n| Method | Path | Description |\n|--------|------|-------------|\n| POST | /datastores | Creates a data store, which is a repository for messages. |\n| DELETE | /datastores/{datastoreName} | Deletes the specified data store. |\n| GET | /datastores/{datastoreName} | Retrieves information about a data store. |\n| GET | /datastores | Retrieves a list of data stores. |\n| PUT | /datastores/{datastoreName} | Used to update the settings of a data store. |\n\n### Logging\n| Method | Path | Description |\n|--------|------|-------------|\n| GET | /logging | Retrieves the current settings of the IoT Analytics logging options. |\n| PUT | /logging | Sets or updates the IoT Analytics logging options. If you update the value of any loggingOptions field, it takes up to one minute for the change to take effect. Also, if you change the policy attached to the role you specified in the roleArn field (for example, to correct an invalid policy), it takes up to five minutes for that change to take effect. |\n\n### Tags\n| Method | Path | Description |\n|--------|------|-------------|\n| GET | /tags | Lists the tags (metadata) that you have assigned to the resource. |\n| POST | /tags | Adds to or modifies the tags of the given resource. Tags are metadata that can be used to manage a resource. |\n| DELETE | /tags | Removes the given tags (metadata) from the resource. |\n\n### Pipeline Activities\n| Method | Path | Description |\n|--------|------|-------------|\n| POST | /pipelineactivities/run | Simulates the results of running a pipeline activity on a message payload. |\n\n## Common Questions\nMatch user requests to endpoints in references/api-spec.lap. 
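For example, the most common flow (sending messages to a channel via POST /messages/batch, the BatchPutMessage operation) can be sketched with boto3. This is a minimal sketch: the channel name and payload shape are hypothetical, and AWS credentials and region are assumed to be configured.\n\n

```python
import json


def build_messages(readings):
    """Build the 'messages' list for POST /messages/batch.

    Each entry needs a unique messageId string and a bytes payload;
    here the payload is JSON-encoded sensor data (hypothetical shape).
    """
    return [
        {"messageId": str(message_id), "payload": json.dumps(body).encode("utf-8")}
        for message_id, body in readings
    ]


def send_batch(client, channel_name, messages):
    """Send one batch to a channel with BatchPutMessage.

    `client` is a boto3 'iotanalytics' client. The call succeeds with
    HTTP 200 even if some messages fail; a non-empty
    'batchPutMessageErrorEntries' list identifies the failures.
    """
    resp = client.batch_put_message(channelName=channel_name, messages=messages)
    return resp.get("batchPutMessageErrorEntries", [])


msgs = build_messages([(1, {"deviceId": "sensor-1", "temperature": 21.5})])
# errors = send_batch(boto3.client("iotanalytics"), "my_channel", msgs)
```

\nNote that a 200 response does not guarantee every message was ingested: always check batchPutMessageErrorEntries for per-message failures.\n\n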
Key patterns:\n- \"Send a batch of messages?\" -> POST /messages/batch\n- \"Cancel a reprocessing?\" -> DELETE /pipelines/{pipelineName}/reprocessing/{reprocessingId}\n- \"Create a channel?\" -> POST /channels\n- \"Create a dataset?\" -> POST /datasets\n- \"Create dataset content?\" -> POST /datasets/{datasetName}/content\n- \"Create a datastore?\" -> POST /datastores\n- \"Create a pipeline?\" -> POST /pipelines\n- \"Delete a channel?\" -> DELETE /channels/{channelName}\n- \"Delete a dataset?\" -> DELETE /datasets/{datasetName}\n- \"Delete dataset content?\" -> DELETE /datasets/{datasetName}/content\n- \"Delete a datastore?\" -> DELETE /datastores/{datastoreName}\n- \"Delete a pipeline?\" -> DELETE /pipelines/{pipelineName}\n- \"Get channel details?\" -> GET /channels/{channelName}\n- \"Get dataset details?\" -> GET /datasets/{datasetName}\n- \"Get datastore details?\" -> GET /datastores/{datastoreName}\n- \"Get logging options?\" -> GET /logging\n- \"Get pipeline details?\" -> GET /pipelines/{pipelineName}\n- \"Get dataset content?\" -> GET /datasets/{datasetName}/content\n- \"List all channels?\" -> GET /channels\n- \"List dataset contents?\" -> GET /datasets/{datasetName}/contents\n- \"List all datasets?\" -> GET /datasets\n- \"List all datastores?\" -> GET /datastores\n- \"List all pipelines?\" -> GET /pipelines\n- \"List all tags?\" -> GET /tags\n- \"Test a pipeline activity?\" -> POST /pipelineactivities/run\n- \"Sample channel messages?\" -> GET /channels/{channelName}/sample\n- \"Reprocess pipeline data?\" -> POST /pipelines/{pipelineName}/reprocessing\n- \"Tag a resource?\" -> POST /tags\n- \"Remove tags?\" -> DELETE /tags\n- \"Update a channel?\" -> PUT /channels/{channelName}\n- \"Update a dataset?\" -> PUT /datasets/{datasetName}\n- \"Update a datastore?\" -> PUT /datastores/{datastoreName}\n- \"Update a pipeline?\" -> PUT /pipelines/{pipelineName}\n- \"How to authenticate?\" -> See Auth section above\n\n## Response Tips\n- Check response schemas in references/api-spec.lap for field details\n- Create endpoints return the new resource's name and ARN; update (PUT) endpoints return an empty body on success\n\n## References\n- Full spec: See 
references/api-spec.lap for complete endpoint details, parameter tables, and response schemas\n\n> Generated from the official API spec by [LAP](https://lap.sh)\n","references/api-spec.lap":"@lap v0.3\n# Machine-readable API spec. Each @endpoint block is one API call.\n@api AWS IoT Analytics\n@version 2017-11-27\n@auth AWS SigV4\n@endpoints 34\n@hint download_for_search\n@toc messages(1), pipelines(7), channels(6), datasets(9), datastores(5), logging(2), tags(3), pipelineactivities(1)\n\n@group messages\n@endpoint POST /messages/batch\n@desc Sends messages to a channel.\n@required {channelName: str, messages: [Message]}\n@returns(200) {batchPutMessageErrorEntries: [BatchPutMessageErrorEntry]?}\n\n@endgroup\n\n@group pipelines\n@endpoint DELETE /pipelines/{pipelineName}/reprocessing/{reprocessingId}\n@desc Cancels the reprocessing of data through the pipeline.\n@required {pipelineName: str, reprocessingId: str}\n\n@endgroup\n\n@group channels\n@endpoint POST /channels\n@desc Used to create a channel. A channel collects data from an MQTT topic and archives the raw, unprocessed messages before publishing the data to a pipeline.\n@required {channelName: str}\n@optional {channelStorage: ChannelStorage, retentionPeriod: RetentionPeriod, tags: [Tag]}\n@returns(200) {channelName: str?, channelArn: str?, retentionPeriod: RetentionPeriod?{unlimited: bool?, numberOfDays: int?}}\n\n@endgroup\n\n@group datasets\n@endpoint POST /datasets\n@desc Used to create a dataset. A dataset stores data retrieved from a data store by applying a queryAction (a SQL query) or a containerAction (executing a containerized application). This operation creates the skeleton of a dataset. 
The dataset can be populated manually by calling CreateDatasetContent or automatically according to a trigger you specify.\n@required {datasetName: str, actions: [DatasetAction]}\n@optional {triggers: [DatasetTrigger], contentDeliveryRules: [DatasetContentDeliveryRule], retentionPeriod: RetentionPeriod, versioningConfiguration: VersioningConfiguration, tags: [Tag], lateDataRules: [LateDataRule]}\n@returns(200) {datasetName: str?, datasetArn: str?, retentionPeriod: RetentionPeriod?{unlimited: bool?, numberOfDays: int?}}\n\n@endpoint POST /datasets/{datasetName}/content\n@desc Creates the content of a dataset by applying a queryAction (a SQL query) or a containerAction (executing a containerized application).\n@required {datasetName: str}\n@optional {versionId: str}\n@returns(200) {versionId: str?}\n\n@endgroup\n\n@group datastores\n@endpoint POST /datastores\n@desc Creates a data store, which is a repository for messages.\n@required {datastoreName: str}\n@optional {datastoreStorage: DatastoreStorage, retentionPeriod: RetentionPeriod, tags: [Tag], fileFormatConfiguration: FileFormatConfiguration, datastorePartitions: DatastorePartitions}\n@returns(200) {datastoreName: str?, datastoreArn: str?, retentionPeriod: RetentionPeriod?{unlimited: bool?, numberOfDays: int?}}\n\n@endgroup\n\n@group pipelines\n@endpoint POST /pipelines\n@desc Creates a pipeline. A pipeline consumes messages from a channel and allows you to process the messages before storing them in a data store. 
You must specify both a channel and a datastore activity and, optionally, as many as 23 additional activities in the pipelineActivities array.\n@required {pipelineName: str, pipelineActivities: [PipelineActivity]}\n@optional {tags: [Tag]}\n@returns(200) {pipelineName: str?, pipelineArn: str?}\n\n@endgroup\n\n@group channels\n@endpoint DELETE /channels/{channelName}\n@desc Deletes the specified channel.\n@required {channelName: str}\n\n@endgroup\n\n@group datasets\n@endpoint DELETE /datasets/{datasetName}\n@desc Deletes the specified dataset. You do not have to delete the content of the dataset before you perform this operation.\n@required {datasetName: str}\n\n@endpoint DELETE /datasets/{datasetName}/content\n@desc Deletes the content of the specified dataset.\n@required {datasetName: str}\n@optional {versionId: str}\n\n@endgroup\n\n@group datastores\n@endpoint DELETE /datastores/{datastoreName}\n@desc Deletes the specified data store.\n@required {datastoreName: str}\n\n@endgroup\n\n@group pipelines\n@endpoint DELETE /pipelines/{pipelineName}\n@desc Deletes the specified pipeline.\n@required {pipelineName: str}\n\n@endgroup\n\n@group channels\n@endpoint GET /channels/{channelName}\n@desc Retrieves information about a channel.\n@required {channelName: str}\n@optional {includeStatistics: bool}\n@returns(200) {channel: Channel?{name: str?, storage: ChannelStorage?{serviceManagedS3: ServiceManagedChannelS3Storage?, customerManagedS3: CustomerManagedChannelS3Storage?{bucket: str, keyPrefix: str?, roleArn: str}}, arn: str?, status: str?, retentionPeriod: RetentionPeriod?{unlimited: bool?, numberOfDays: int?}, creationTime: str(timestamp)?, lastUpdateTime: str(timestamp)?, lastMessageArrivalTime: str(timestamp)?}, statistics: ChannelStatistics?{size: EstimatedResourceSize?{estimatedSizeInBytes: num(f64)?, estimatedOn: str(timestamp)?}}}\n\n@endgroup\n\n@group datasets\n@endpoint GET /datasets/{datasetName}\n@desc Retrieves information about a dataset.\n@required 
{datasetName: str}\n@returns(200) {dataset: Dataset?{name: str?, arn: str?, actions: [DatasetAction]?, triggers: [DatasetTrigger]?, contentDeliveryRules: [DatasetContentDeliveryRule]?, status: str?, creationTime: str(timestamp)?, lastUpdateTime: str(timestamp)?, retentionPeriod: RetentionPeriod?{unlimited: bool?, numberOfDays: int?}, versioningConfiguration: VersioningConfiguration?{unlimited: bool?, maxVersions: int?}, lateDataRules: [LateDataRule]?}}\n\n@endgroup\n\n@group datastores\n@endpoint GET /datastores/{datastoreName}\n@desc Retrieves information about a data store.\n@required {datastoreName: str}\n@optional {includeStatistics: bool}\n@returns(200) {datastore: Datastore?{name: str?, storage: DatastoreStorage?{serviceManagedS3: ServiceManagedDatastoreS3Storage?, customerManagedS3: CustomerManagedDatastoreS3Storage?{bucket: str, keyPrefix: str?, roleArn: str}, iotSiteWiseMultiLayerStorage: DatastoreIotSiteWiseMultiLayerStorage?{customerManagedS3Storage: IotSiteWiseCustomerManagedDatastoreS3Storage}}, arn: str?, status: str?, retentionPeriod: RetentionPeriod?{unlimited: bool?, numberOfDays: int?}, creationTime: str(timestamp)?, lastUpdateTime: str(timestamp)?, lastMessageArrivalTime: str(timestamp)?, fileFormatConfiguration: FileFormatConfiguration?{jsonConfiguration: JsonConfiguration?, parquetConfiguration: ParquetConfiguration?{schemaDefinition: SchemaDefinition?}}, datastorePartitions: DatastorePartitions?{partitions: [DatastorePartition]?}}, statistics: DatastoreStatistics?{size: EstimatedResourceSize?{estimatedSizeInBytes: num(f64)?, estimatedOn: str(timestamp)?}}}\n\n@endgroup\n\n@group logging\n@endpoint GET /logging\n@desc Retrieves the current settings of the IoT Analytics logging options.\n@returns(200) {loggingOptions: LoggingOptions?{roleArn: str, level: str, enabled: bool}}\n\n@endgroup\n\n@group pipelines\n@endpoint GET /pipelines/{pipelineName}\n@desc Retrieves information about a pipeline.\n@required {pipelineName: str}\n@returns(200) 
{pipeline: Pipeline?{name: str?, arn: str?, activities: [PipelineActivity]?, reprocessingSummaries: [ReprocessingSummary]?, creationTime: str(timestamp)?, lastUpdateTime: str(timestamp)?}}\n\n@endgroup\n\n@group datasets\n@endpoint GET /datasets/{datasetName}/content\n@desc Retrieves the contents of a dataset as presigned URIs.\n@required {datasetName: str}\n@optional {versionId: str}\n@returns(200) {entries: [DatasetEntry]?, timestamp: str(timestamp)?, status: DatasetContentStatus?{state: str?, reason: str?}}\n\n@endgroup\n\n@group channels\n@endpoint GET /channels\n@desc Retrieves a list of channels.\n@optional {nextToken: str, maxResults: int}\n@returns(200) {channelSummaries: [ChannelSummary]?, nextToken: str?}\n\n@endgroup\n\n@group datasets\n@endpoint GET /datasets/{datasetName}/contents\n@desc Lists information about dataset contents that have been created.\n@required {datasetName: str}\n@optional {nextToken: str, maxResults: int, scheduledOnOrAfter: str(timestamp), scheduledBefore: str(timestamp)}\n@returns(200) {datasetContentSummaries: [DatasetContentSummary]?, nextToken: str?}\n\n@endpoint GET /datasets\n@desc Retrieves information about datasets.\n@optional {nextToken: str, maxResults: int}\n@returns(200) {datasetSummaries: [DatasetSummary]?, nextToken: str?}\n\n@endgroup\n\n@group datastores\n@endpoint GET /datastores\n@desc Retrieves a list of data stores.\n@optional {nextToken: str, maxResults: int}\n@returns(200) {datastoreSummaries: [DatastoreSummary]?, nextToken: str?}\n\n@endgroup\n\n@group pipelines\n@endpoint GET /pipelines\n@desc Retrieves a list of pipelines.\n@optional {nextToken: str, maxResults: int}\n@returns(200) {pipelineSummaries: [PipelineSummary]?, nextToken: str?}\n\n@endgroup\n\n@group tags\n@endpoint GET /tags\n@desc Lists the tags (metadata) that you have assigned to the resource.\n@required {resourceArn: str}\n@returns(200) {tags: [Tag]?}\n\n@endgroup\n\n@group logging\n@endpoint PUT /logging\n@desc Sets or updates the IoT 
Analytics logging options. If you update the value of any loggingOptions field, it takes up to one minute for the change to take effect. Also, if you change the policy attached to the role you specified in the roleArn field (for example, to correct an invalid policy), it takes up to five minutes for that change to take effect.\n@required {loggingOptions: LoggingOptions}\n\n@endgroup\n\n@group pipelineactivities\n@endpoint POST /pipelineactivities/run\n@desc Simulates the results of running a pipeline activity on a message payload.\n@required {pipelineActivity: PipelineActivity, payloads: [bytes]}\n@returns(200) {payloads: [bytes]?, logResult: str?}\n\n@endgroup\n\n@group channels\n@endpoint GET /channels/{channelName}/sample\n@desc Retrieves a sample of messages from the specified channel ingested during the specified timeframe. Up to 10 messages can be retrieved.\n@required {channelName: str}\n@optional {maxMessages: int, startTime: str(timestamp), endTime: str(timestamp)}\n@returns(200) {payloads: [bytes]?}\n\n@endgroup\n\n@group pipelines\n@endpoint POST /pipelines/{pipelineName}/reprocessing\n@desc Starts the reprocessing of raw message data through the pipeline.\n@required {pipelineName: str}\n@optional {startTime: str(timestamp), endTime: str(timestamp), channelMessages: ChannelMessages}\n@returns(200) {reprocessingId: str?}\n\n@endgroup\n\n@group tags\n@endpoint POST /tags\n@desc Adds to or modifies the tags of the given resource. 
Tags are metadata that can be used to manage a resource.\n@required {resourceArn: str, tags: [Tag]}\n\n@endpoint DELETE /tags\n@desc Removes the given tags (metadata) from the resource.\n@required {resourceArn: str, tagKeys: [str]}\n\n@endgroup\n\n@group channels\n@endpoint PUT /channels/{channelName}\n@desc Used to update the settings of a channel.\n@required {channelName: str}\n@optional {channelStorage: ChannelStorage, retentionPeriod: RetentionPeriod}\n\n@endgroup\n\n@group datasets\n@endpoint PUT /datasets/{datasetName}\n@desc Updates the settings of a dataset.\n@required {datasetName: str, actions: [DatasetAction]}\n@optional {triggers: [DatasetTrigger], contentDeliveryRules: [DatasetContentDeliveryRule], retentionPeriod: RetentionPeriod, versioningConfiguration: VersioningConfiguration, lateDataRules: [LateDataRule]}\n\n@endgroup\n\n@group datastores\n@endpoint PUT /datastores/{datastoreName}\n@desc Used to update the settings of a data store.\n@required {datastoreName: str}\n@optional {retentionPeriod: RetentionPeriod, datastoreStorage: DatastoreStorage, fileFormatConfiguration: FileFormatConfiguration}\n\n@endgroup\n\n@group pipelines\n@endpoint PUT /pipelines/{pipelineName}\n@desc Updates the settings of a pipeline. You must specify both a channel and a datastore activity and, optionally, as many as 23 additional activities in the pipelineActivities array.\n@required {pipelineName: str, pipelineActivities: [PipelineActivity]}\n\n@endgroup\n\n@end\n"}}