Anya
Setting up a trigger when integrating with BigQuery
Thank you! What about Kafka triggers?
The process I want to set up is the following:
I want to create a Pub/Sub topic for BigQuery events and a Cloud Function that listens to it and sends a webhook to Windmill when certain events happen (e.g., updates/deletions).
So would setting up a Kafka trigger do the same job?
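For reference, a minimal sketch of the Cloud Function side of the setup described above, assuming a Pub/Sub-triggered function using functions-framework and requests; the environment variable names and the function name are hypothetical, and the webhook URL and token would come from the Windmill script or flow you want to run:

```python
import base64
import json
import os

import functions_framework
import requests

# Hypothetical names: configure these on the function. The webhook URL
# and token come from the target Windmill script/flow's webhook settings.
WINDMILL_WEBHOOK_URL = os.environ["WINDMILL_WEBHOOK_URL"]
WINDMILL_TOKEN = os.environ["WINDMILL_TOKEN"]


@functions_framework.cloud_event
def forward_to_windmill(cloud_event):
    # Pub/Sub delivers the payload base64-encoded under message.data.
    payload = base64.b64decode(cloud_event.data["message"]["data"])
    event = json.loads(payload)
    # Forward the event to Windmill; filtering for the events of interest
    # (e.g. updates/deletions) would happen here.
    requests.post(
        WINDMILL_WEBHOOK_URL,
        headers={"Authorization": f"Bearer {WINDMILL_TOKEN}"},
        json=event,
        timeout=10,
    ).raise_for_status()
```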
Writing data to external Azure Blob Storage
@Tiago Serafim Thanks!
The first point makes sense, but when I run my function, I get the following error message related to installing packages:
RuntimeError:
Starting with v5.0.0, the 'azure' meta-package is deprecated and cannot be installed anymore.
Please install the service specific packages prefixed by azure needed for your application.
The complete list of available packages can be found at: https://aka.ms/azsdk/python/all
Here's a non-exhaustive list of common packages:
- azure-mgmt-compute (https://pypi.python.org/pypi/azure-mgmt-compute): Management of Virtual Machines, etc.
- azure-mgmt-storage (https://pypi.python.org/pypi/azure-mgmt-storage): Management of storage accounts.
- azure-mgmt-resource (https://pypi.python.org/pypi/azure-mgmt-resource): Generic package about Azure Resource Management (ARM)
- azure-keyvault-secrets (https://pypi.python.org/pypi/azure-keyvault-secrets): Access to secrets in Key Vault
- azure-storage-blob (https://pypi.python.org/pypi/azure-storage-blob): Access to blobs in storage accounts
A more comprehensive discussion of the rationale for this decision can be found in the following issue:
https://github.com/Azure/azure-sdk-for-python/issues/10646
hint: This usually indicates a problem with the package or the build environment.
Any thoughts?
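Per the error, the deprecated azure meta-package is being pulled in as a dependency; for blob access the fix should be to depend on the service-specific azure-storage-blob package instead. A minimal sketch, assuming a storage-account connection string is passed in as an argument; all parameter names here are illustrative:

```python
# Assumes the azure-storage-blob package (not the deprecated 'azure'
# meta-package) is the declared dependency.
from azure.storage.blob import BlobServiceClient


def main(connection_string: str, container: str, blob_name: str, content: str) -> str:
    # Build a client from the storage-account connection string.
    service = BlobServiceClient.from_connection_string(connection_string)
    blob = service.get_blob_client(container=container, blob=blob_name)
    # Upload the payload, replacing any existing blob with the same name.
    blob.upload_blob(content, overwrite=True)
    return blob.url
```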