How to reach a Google Storage Bucket from an Azure Function

Published by Tanguy SCHOUBERT
Category: .NET / Azure / Azure Functions
07/06/2021

In the context of developing a data flow, we need to connect to a Google Storage Bucket. Since there is no Logic App connector that does this easily, we will do it from an Azure Function. The idea is to access the bucket, retrieve all the files it contains, process them, and then move them to an archive folder in the same bucket. We will go through the necessary steps to do this.

 

Requirements

 

The NuGet package Google.Cloud.Storage.V1 is required (3.3.0 is the current version at the time of writing). In our function, we will reference these two namespaces, pulled in by the using directives sketched just after the list:

  • Google.Apis.Auth.OAuth2
  • Google.Cloud.Storage.V1
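
A minimal sketch of the corresponding using directives, including the System namespaces the later snippets also rely on:

using System.IO;                 // Stream, MemoryStream, StreamReader, Path
using System.Text;               // Encoding
using Google.Apis.Auth.OAuth2;   // GoogleCredential
using Google.Cloud.Storage.V1;   // StorageClient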

 

Connecting to Google Storage

 

We then need to assemble the service account credentials (project ID, private key ID, private key, client email and client ID) into the specific JSON structure that Google expects:

 

// JSON template of a service account key; placeholders {0} to {4} are filled in later
string _credsTemplate = "{{"
    + "\"type\": \"service_account\","
    + "\"project_id\": \"{0}\","
    + "\"private_key_id\": \"{1}\","
    + "\"private_key\": \"{2}\","
    + "\"client_email\": \"{3}\","
    + "\"client_id\": \"{4}\","
    + "\"auth_uri\": \"https://accounts.google.com/o/oauth2/auth\","
    + "\"token_uri\": \"https://oauth2.googleapis.com/token\","
    + "\"auth_provider_x509_cert_url\": \"https://www.googleapis.com/oauth2/v1/certs\","
    + "\"client_x509_cert_url\": \"https://www.googleapis.com/robot/v1/metadata/x509/example.iam.gserviceaccount.com\""
    + "}}";

 

Once the placeholders have been replaced to produce a googleCreds string, we can connect using the following two lines:

 

// Build the credential from the JSON string and create the storage client
GoogleCredential credential = GoogleCredential.FromJson(googleCreds);
StorageClient storage = StorageClient.Create(credential);
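
As an optional sanity check (not part of the original flow), the connection can be exercised right away by fetching the bucket's metadata; "my-bucket" is a placeholder name:

// Hypothetical bucket name; any bucket the service account can read will do
var bucket = storage.GetBucket("my-bucket");
Console.WriteLine($"Connected to bucket {bucket.Name} in {bucket.Location}");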

 

Retrieving files

 

We can now list the objects under the folderName prefix and download each one:

 

// List every object under the folderName prefix of the bucket
// (if the archive folder lives under the same prefix, you may want to skip its objects here)
foreach (var storageObject in storage.ListObjects(bucketName, folderName))
{
    string fileContent = string.Empty;
    using (Stream stream = new MemoryStream())
    {
        // Download the object into memory, then rewind the stream before reading it
        storage.DownloadObject(bucketName, storageObject.Name, stream);
        stream.Seek(0, SeekOrigin.Begin);
        using (StreamReader reader = new StreamReader(stream))
        {
            fileContent = reader.ReadToEnd();
        }
    }
    // Do whatever with the file content
}

 

Archiving in the bucket

 

After processing each file, we can move it to an archive folder by re-uploading it from the retrieved fileContent and then deleting the original object, still inside the loop:

 

// Re-encode the content with UTF-8, matching the StreamReader default used above
byte[] byteArray = Encoding.UTF8.GetBytes(fileContent);
using (Stream stream = new MemoryStream(byteArray))
{
    // storageObject.Name already holds the full path inside the bucket,
    // so keep only the file name when building the archive path
    string archiveName = "mypath/archives/" + Path.GetFileName(storageObject.Name);
    storage.UploadObject(bucketName, archiveName, null, stream);
    storage.DeleteObject(bucketName, storageObject.Name);
}
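
To tie the pieces together, here is a minimal sketch of the whole flow hosted in a timer-triggered Azure Function (in-process model). The function name, trigger, schedule, setting names, bucket name, prefix and the skip rule for already-archived objects are assumptions for illustration; plug in whatever trigger and configuration your data flow actually uses:

using System;
using System.IO;
using System.Text;
using Google.Apis.Auth.OAuth2;
using Google.Cloud.Storage.V1;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class GoogleBucketFunction
{
    // Same JSON template as in the "Connecting to Google Storage" section above
    private const string _credsTemplate = "{{"
        + "\"type\": \"service_account\","
        + "\"project_id\": \"{0}\","
        + "\"private_key_id\": \"{1}\","
        + "\"private_key\": \"{2}\","
        + "\"client_email\": \"{3}\","
        + "\"client_id\": \"{4}\","
        + "\"auth_uri\": \"https://accounts.google.com/o/oauth2/auth\","
        + "\"token_uri\": \"https://oauth2.googleapis.com/token\","
        + "\"auth_provider_x509_cert_url\": \"https://www.googleapis.com/oauth2/v1/certs\","
        + "\"client_x509_cert_url\": \"https://www.googleapis.com/robot/v1/metadata/x509/example.iam.gserviceaccount.com\""
        + "}}";

    // Hypothetical hourly schedule; any trigger type would work just as well
    [FunctionName("ProcessGoogleBucket")]
    public static void Run([TimerTrigger("0 0 * * * *")] TimerInfo timer, ILogger log)
    {
        // Hypothetical application settings holding the service account values
        string googleCreds = string.Format(
            _credsTemplate,
            Environment.GetEnvironmentVariable("GoogleProjectId"),
            Environment.GetEnvironmentVariable("GooglePrivateKeyId"),
            Environment.GetEnvironmentVariable("GooglePrivateKey"),
            Environment.GetEnvironmentVariable("GoogleClientEmail"),
            Environment.GetEnvironmentVariable("GoogleClientId"));

        GoogleCredential credential = GoogleCredential.FromJson(googleCreds);
        StorageClient storage = StorageClient.Create(credential);

        string bucketName = "my-bucket"; // assumption: your bucket name
        string folderName = "mypath/";   // assumption: prefix of the files to process

        foreach (var storageObject in storage.ListObjects(bucketName, folderName))
        {
            // Skip folder placeholder objects and anything already archived
            if (storageObject.Name.EndsWith("/") || storageObject.Name.Contains("/archives/"))
            {
                continue;
            }

            // Download the object content into memory
            string fileContent;
            using (Stream stream = new MemoryStream())
            {
                storage.DownloadObject(bucketName, storageObject.Name, stream);
                stream.Seek(0, SeekOrigin.Begin);
                using (StreamReader reader = new StreamReader(stream))
                {
                    fileContent = reader.ReadToEnd();
                }
            }

            log.LogInformation($"Processing {storageObject.Name}");
            // ... process fileContent here ...

            // Archive: re-upload under the archive prefix, then delete the original object
            byte[] byteArray = Encoding.UTF8.GetBytes(fileContent);
            using (Stream stream = new MemoryStream(byteArray))
            {
                string archiveName = folderName + "archives/" + Path.GetFileName(storageObject.Name);
                storage.UploadObject(bucketName, archiveName, null, stream);
                storage.DeleteObject(bucketName, storageObject.Name);
            }
        }
    }
}

Keeping the archive step inside the same loop means each file is deleted from the incoming folder only after its content has been re-uploaded, so a failure mid-run leaves the remaining files in place for the next execution.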