{"files":{"SKILL.md":"---\nname: warehouse-connectors-api\ndescription: \"Warehouse Connectors API skill. Use when working with Warehouse Connectors in a project. Covers 10 endpoints.\"\nversion: 1.0.0\ngenerator: lapsh\n---\n\n# Warehouse Connectors API\nAPI version: 1.0.0\n\n## Auth\nNo authentication required.\n\n## Base URL\nNot specified.\n\n## Setup\n1. No auth setup needed\n2. GET /projects/{projectId}/warehouse-sources/imports -- list all warehouse imports\n3. POST /projects/{projectId}/warehouse-sources/imports/event-stream -- create your first event-stream import\n\n## Endpoints\n10 endpoints across 1 group. See references/api-spec.lap for full details.\n\n### Projects\n| Method | Path | Description |\n|--------|------|-------------|\n| GET | /projects/{projectId}/warehouse-sources/imports | List all warehouse imports |\n| GET | /projects/{projectId}/warehouse-sources/imports/{importId} | Get a specific warehouse import |\n| PATCH | /projects/{projectId}/warehouse-sources/imports/{importId} | Update a warehouse import |\n| DELETE | /projects/{projectId}/warehouse-sources/imports/{importId} | Delete a warehouse import |\n| POST | /projects/{projectId}/warehouse-sources/imports/event-stream | Create an event stream import |\n| POST | /projects/{projectId}/warehouse-sources/imports/people | Create a people (user profiles) import |\n| POST | /projects/{projectId}/warehouse-sources/imports/groups | Create a groups import |\n| POST | /projects/{projectId}/warehouse-sources/imports/lookup-table | Create a lookup table import |\n| PUT | /projects/{projectId}/warehouse-sources/imports/{importId}/manual-sync | Run an import |\n| GET | /projects/{projectId}/warehouse-sources/imports/{importId}/history | Get import job history |\n\n## Common Questions\nMatch user requests to endpoints in references/api-spec.lap.
Key patterns:\n- \"List all imports?\" -> GET /projects/{projectId}/warehouse-sources/imports\n- \"Get import details?\" -> GET /projects/{projectId}/warehouse-sources/imports/{importId}\n- \"Partially update an import?\" -> PATCH /projects/{projectId}/warehouse-sources/imports/{importId}\n- \"Delete an import?\" -> DELETE /projects/{projectId}/warehouse-sources/imports/{importId}\n- \"Create an event-stream import?\" -> POST /projects/{projectId}/warehouse-sources/imports/event-stream\n- \"Create a people import?\" -> POST /projects/{projectId}/warehouse-sources/imports/people\n- \"Create a groups import?\" -> POST /projects/{projectId}/warehouse-sources/imports/groups\n- \"Create a lookup-table import?\" -> POST /projects/{projectId}/warehouse-sources/imports/lookup-table\n- \"Run an import now?\" -> PUT /projects/{projectId}/warehouse-sources/imports/{importId}/manual-sync\n- \"Get import job history?\" -> GET /projects/{projectId}/warehouse-sources/imports/{importId}/history\n\n## Response Tips\n- Check response schemas in references/api-spec.lap for field details\n- Create/update endpoints return the modified resource on success\n- Error responses include status codes and descriptions in the spec\n\n## References\n- Full spec: See references/api-spec.lap for complete endpoint details, parameter tables, and response schemas\n\n> Generated from the official API spec by [LAP](https://lap.sh)\n","references/api-spec.lap":"@lap v0.3\n# Machine-readable API spec.
Each @endpoint block is one API call.\n@api Warehouse Connectors API\n@version 1.0.0\n@endpoints 10\n@toc projects(10)\n\n@endpoint GET /projects/{projectId}/warehouse-sources/imports\n@desc List all warehouse imports\n@returns(200) {status: any, results: [map]} # Success\n@errors {401, 403}\n\n@endpoint GET /projects/{projectId}/warehouse-sources/imports/{importId}\n@desc Get a specific warehouse import\n@returns(200) {status: any, results: any} # Success\n@errors {401, 403, 404}\n\n@endpoint PATCH /projects/{projectId}/warehouse-sources/imports/{importId}\n@desc Update a warehouse import\n@required {paused: bool # Whether to pause (true) or resume (false) the import, run_every: int(0/3600000000000/86400000000000/604800000000000) # Sync frequency in nanoseconds. Only these values are accepted: - `0` - API-triggered only (use the manual-sync endpoint to trigger) - `3600000000000` - Hourly - `86400000000000` - Daily - `604800000000000` - Weekly}\n@optional {databricks_params: map{export_cluster_config: map}}\n@returns(200) {status: any, results: any} # Success\n@errors {400, 401, 403, 404}\n\n@endpoint DELETE /projects/{projectId}/warehouse-sources/imports/{importId}\n@desc Delete a warehouse import\n@optional {delete_data: bool=false # Whether to also delete the imported data from Mixpanel}\n@returns(200) {status: any} # Success\n@errors {401, 403, 404}\n\n@endpoint POST /projects/{projectId}/warehouse-sources/imports/event-stream\n@desc Create an event stream import\n@required {import_type: str, warehouse_source_id: int, table_params: map # Table location parameters (structure depends on warehouse type), time_column_name: str, sync_mode: str(time_based/mirror_mode/full_sync/one_time)}\n@optional {event_name: str, event_column_name: str, user_column_name: str, company_column_name: str # Required for B2B projects. 
The column containing the company identifier., device_column_name: str, json_properties_column_name: str, run_every: int(0/3600000000000/86400000000000/604800000000000) # Sync frequency in nanoseconds. Only these values are accepted: - `0` - API-triggered only (use the manual-sync endpoint to trigger) - `3600000000000` - Hourly - `86400000000000` - Daily - `604800000000000` - Weekly, insert_time_column_name: str, property_mappings: map, databricks_params: map{export_cluster_config: map}}\n@returns(200) {status: any, results: any} # Success\n@errors {400, 401, 403}\n\n@endpoint POST /projects/{projectId}/warehouse-sources/imports/people\n@desc Create a people (user profiles) import\n@required {import_type: str, warehouse_source_id: int, table_params: map # Table location parameters (structure depends on warehouse type), user_column_name: str, sync_mode: str(time_based/mirror_mode/full_sync/one_time)}\n@optional {json_properties_column_name: str, run_every: int(0/3600000000000/86400000000000/604800000000000) # Sync frequency in nanoseconds. Only these values are accepted: - `0` - API-triggered only (use the manual-sync endpoint to trigger) - `3600000000000` - Hourly - `86400000000000` - Daily - `604800000000000` - Weekly, insert_time_column_name: str, property_mappings: map, databricks_params: map{export_cluster_config: map}}\n@returns(200) {status: any, results: any} # Success\n@errors {400, 401, 403}\n\n@endpoint POST /projects/{projectId}/warehouse-sources/imports/groups\n@desc Create a groups import\n@required {import_type: str, warehouse_source_id: int, table_params: map # Table location parameters (structure depends on warehouse type), group_key: str, group_id_column: str, sync_mode: str(time_based/mirror_mode/full_sync/one_time)}\n@optional {run_every: int(0/3600000000000/86400000000000/604800000000000) # Sync frequency in nanoseconds. 
Only these values are accepted: - `0` - API-triggered only (use the manual-sync endpoint to trigger) - `3600000000000` - Hourly - `86400000000000` - Daily - `604800000000000` - Weekly, insert_time_column_name: str, json_properties_column_name: str, property_mappings: map, databricks_params: map{export_cluster_config: map}}\n@returns(200) {status: any, results: any} # Success\n@errors {400, 401, 403}\n\n@endpoint POST /projects/{projectId}/warehouse-sources/imports/lookup-table\n@desc Create a lookup table import\n@required {import_type: str, warehouse_source_id: int, table_params: map # Table location parameters (structure depends on warehouse type), mixpanel_property: map{value!: str, resourceType!: str}, property_key_column_name: str, sync_mode: str(full_sync/one_time)}\n@optional {run_every: int(0/3600000000000/86400000000000/604800000000000) # Sync frequency in nanoseconds. Only these values are accepted: - `0` - API-triggered only (use the manual-sync endpoint to trigger) - `3600000000000` - Hourly - `86400000000000` - Daily - `604800000000000` - Weekly, databricks_params: map{export_cluster_config: map}}\n@returns(200) {status: any, results: any} # Success\n@errors {400, 401, 403}\n\n@endpoint PUT /projects/{projectId}/warehouse-sources/imports/{importId}/manual-sync\n@desc Run an import\n@returns(200) {status: any, results: map{run_id: str}} # Success\n@errors {401, 403, 404}\n\n@endpoint GET /projects/{projectId}/warehouse-sources/imports/{importId}/history\n@desc Get import job history\n@returns(200) {status: any, results: map{runs: [map]}} # Success\n@errors {401, 403, 404}\n\n@end\n"}}