fictionco / fiction

(Public Release Summer 2024) Personal Marketing Platform. A powerful platform for your online identity.

Home Page: https://www.fiction.com


Plugin to use Microsoft Azure Blob Storage with Factor

eagerestwolf opened this issue · comments

I have created a plugin to use Microsoft Azure Blob Storage for Factor's images instead of Amazon S3. I have linked the NPM package and the git repository below. Sorry if it looks like I borrowed heavily from the actual Factor code (I did); the plugin documentation isn't great just yet, so I had to go by example.

NPM: https://www.npmjs.com/package/@modern-classic/factor-plugin-storage-azure
Github: https://github.com/modern-classic/factor-plugin-storage-azure

Disclaimer: I have not tested it yet. It's a bit late here on the East Coast of the US, so I will get to testing it first thing tomorrow.

Awesome. Nothing wrong with borrowing heavily :)

What do you feel would be helpful to add to the documentation? A list of filters, etc.?

We're working on extension listings right now, and I'm happy to add this along with backlinks and so on. All that's needed is the metainfo and screenshots:

https://factor.dev/docs/extension-guidelines

I will get on the information for the extension guidelines as soon as I can. In terms of documentation, I would say a list of hooks that developers can hook into would be nice, along with some simple samples showing how a plugin can be set up. The documentation is there for frontend plugins, but not backend. Something like this, so developers understand how to get started:

import { addFilter, addCallback } from '@factor/api';

// All plugins require a setup function to establish hooks, callbacks, etc.
// This function is passed no arguments and should return void.
export function setup(): void {
  // Check to see if your plugin needs setup
  if (!process.env.FACTOR_MY_PLUGIN_SMART) {
    // Tell Factor that your plugin requires setup
    addFilter({
      // The key is used by Factor to track your plugin. This should be unique
      key: 'myPluginSetup',

      // The `setup-needed` hook tells Factor that your plugin needs to be set up
      hook: 'setup-needed',

      // The callback is passed a single argument, an array of events. For this we use the type `{ title: string }[]`.
      // You should append your event to the array, preferably with spread syntax so it's synchronous.
      callback: (__: { title: string }[]) => {
        return [
          ...__,
          {
            // The title is used to tell the user what your plugin needs
            // For example, if your plugin needs a smart variable, you might say:
            title: 'My Plugin: Needs smart variable',

            // The file property tells the user which file they can edit to add this variable
            file: '.env',

            // The name property tells the user what the variable is called so they can add it
            name: 'FACTOR_MY_PLUGIN_SMART',
          },
        ];
      },
    });

    // Since your plugin requires setup, you should return early here so that
    // no filters or callbacks get registered and cause errors
    return;
  }

  // Add your filters and callbacks here
}

// Don't forget to call your setup function at the end of your file to register
// everything
setup();

Just a heads up, things are kind of on pause. My plugin had a few issues (I expected as much). The main one is that I cannot get the URL returned from the storage-attachment-url hook to work at the moment. I'm not sure if this is on Factor's end or (more likely) Azure's end. The files do get uploaded and deleted without issue, though.

If you can get a URL back from a standard buffer then the rest should be trivial.

In the case of S3, that's this code:

  const params = {
    Bucket: bucket,
    Key: key,
    Body: buffer,
    ACL: "public-read",
    ContentType: mimetype,
  }

  // `Location` on the upload response is the URL
  S3.upload(params, (error: Error, data: { Location: string }) => {
    if (error) return reject(error)

    const { Location } = data || {}

    resolve(Location)
  })

From there, the rest of the code can be identical to the S3 stuff.

Yeah, I think my issue is that I'm awaiting the upload (similar to how the Google Cloud plugin does things), because Google Cloud and Microsoft Azure have similar APIs in the sense that both use Promises to handle the upload. The problem is that, for whatever reason, Microsoft's API doesn't include the URL in the upload response, so I have to wait for the upload to finish and then return the URL directly. I also can't just build the URL manually, because Azure Storage allows the user to set a custom domain, so I need to do some investigation on my end. It shouldn't take too long, but I do have a couple of other things to do for my business while I still have some downtime, so it's on the back burner for now (probably no more than a day or two).
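For what it's worth, the custom-domain concern is exactly why hand-building the URL is fragile. The sketch below is a hypothetical helper (not SDK code) that naively assembles a blob URL from the connection string; every branch is a place for the guess to go wrong, which is why reading `BlockBlobClient.url` back from the client after the upload resolves is the safer route:

```typescript
// Hypothetical sketch: why building blob URLs by hand is fragile.
// An Azure connection string is a ;-separated key=value list; a custom
// domain shows up as an explicit BlobEndpoint entry that overrides the
// default <account>.blob.<suffix> host.
function blobUrlFromConnectionString(
  conn: string,
  container: string,
  key: string
): string {
  const parts = new Map(
    conn.split(";").filter(Boolean).map((pair) => {
      const i = pair.indexOf("=");
      return [pair.slice(0, i), pair.slice(i + 1)] as [string, string];
    })
  );

  // A custom domain (BlobEndpoint) wins; otherwise assemble the default host.
  const endpoint =
    parts.get("BlobEndpoint") ??
    `https://${parts.get("AccountName")}.blob.${parts.get("EndpointSuffix") ?? "core.windows.net"}`;

  return `${endpoint.replace(/\/$/, "")}/${container}/${key}`;
}

console.log(
  blobUrlFromConnectionString(
    "DefaultEndpointsProtocol=https;AccountName=demo;AccountKey=xxx;EndpointSuffix=core.windows.net",
    "images",
    "photo.jpg"
  )
);
```

Any account using a custom domain, SAS tokens, or a non-default endpoint suffix breaks a builder like this, while the client's own `url` property already reflects whatever endpoint it was constructed from.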

Following this, would like to see Azure storage integration. Let me know if I can help!

Well, it's not throwing errors anymore, but I don't know if I'm doing something wrong or if this is how Factor is intended to work; either way, nothing is being uploaded to Azure. I tried adding a new page and a new blog post, and I tried attaching the images as an avatar, as an attachment, and by dragging them into the post. I used two images: a low-resolution JPEG downloaded from Facebook (71 KB) and a high-resolution stock photo from Unsplash (2.4 MB). The images work when uploaded as an avatar or an attachment, but when I tested on the page, the URL shortcode doesn't work: it literally writes {{key}}.url to the HTML instead of the actual URL. Additionally, no matter which image I use, if I drag and drop the image directly into the post, Express complains that the entity is too large, as shown below. I don't think this is related to my plugin, as Azure has a file size limit of 4.77 TB.

PayloadTooLargeError: request entity too large
    at readStream (C:\Users\Seth\Projects\website\node_modules\raw-body\index.js:155:17)
    at getRawBody (C:\Users\Seth\Projects\website\node_modules\raw-body\index.js:108:12)
    at read (C:\Users\Seth\Projects\website\node_modules\body-parser\lib\read.js:77:3)
    at jsonParser (C:\Users\Seth\Projects\website\node_modules\body-parser\lib\types\json.js:135:5)
    at Layer.handle [as handle_request] (C:\Users\Seth\Projects\website\node_modules\express\lib\router\layer.js:95:5)
    at trim_prefix (C:\Users\Seth\Projects\website\node_modules\express\lib\router\index.js:317:13)
    at C:\Users\Seth\Projects\website\node_modules\express\lib\router\index.js:284:7
    at Function.process_params (C:\Users\Seth\Projects\website\node_modules\express\lib\router\index.js:335:12)
    at next (C:\Users\Seth\Projects\website\node_modules\express\lib\router\index.js:275:10)
    at urlencodedParser (C:\Users\Seth\Projects\website\node_modules\body-parser\lib\types\urlencoded.js:100:7)
    at Layer.handle [as handle_request] (C:\Users\Seth\Projects\website\node_modules\express\lib\router\layer.js:95:5)
    at trim_prefix (C:\Users\Seth\Projects\website\node_modules\express\lib\router\index.js:317:13)
    at C:\Users\Seth\Projects\website\node_modules\express\lib\router\index.js:284:7
    at Function.process_params (C:\Users\Seth\Projects\website\node_modules\express\lib\router\index.js:335:12)
    at next (C:\Users\Seth\Projects\website\node_modules\express\lib\router\index.js:275:10)
    at internalNext (C:\Users\Seth\Projects\website\node_modules\helmet\index.js:47:33)
    at xXssProtection (C:\Users\Seth\Projects\website\node_modules\x-xss-protection\dist\index.js:47:13)
    at internalNext (C:\Users\Seth\Projects\website\node_modules\helmet\index.js:51:7)
    at nosniff (C:\Users\Seth\Projects\website\node_modules\dont-sniff-mimetype\dist\index.js:5:9)
    at internalNext (C:\Users\Seth\Projects\website\node_modules\helmet\index.js:51:7)
    at ienoopen (C:\Users\Seth\Projects\website\node_modules\ienoopen\dist\index.js:5:9)
    at internalNext (C:\Users\Seth\Projects\website\node_modules\helmet\index.js:51:7)

I should also specify that I know it's not an issue with my code itself, since I created a test script that uses the exact same code, ran it with ts-node, and it ran fine. Also, the only extension requirement I have not yet met is the screenshot requirement, and I don't see how a screenshot would be of any use for an image attachment plugin.
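A quick sanity check on the size angle: body-parser's `json()` middleware defaults to a 100kb body limit, and if the dragged image travels base64-encoded inside a JSON body (an assumption on my part about Factor's transport), the payload inflates by roughly a third, putting the 2.4 MB photo far over the default:

```typescript
// Back-of-the-envelope: a base64-encoded 2.4 MB image vs body-parser's
// default 100kb JSON limit. (Assumes the editor posts the image data
// base64-encoded in a JSON body; Factor's exact transport may differ.)
const imageBytes = 2.4 * 1024 * 1024;              // the 2.4 MB Unsplash photo
const base64Bytes = Math.ceil(imageBytes / 3) * 4; // base64 inflates ~33%
const defaultLimitBytes = 100 * 1024;              // body-parser's "100kb" default

console.log(base64Bytes);                     // ~3.2 MB encoded
console.log(base64Bytes > defaultLimitBytes); // true
```

If that is indeed the cause, body-parser's `json()` accepts a `limit` option (e.g. `'5mb'`); whether and where Factor exposes that knob, I don't know.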

  • The resolution of the pictures shouldn't matter, as they are optimized in the browser to a reasonable size.

  • The shortcode thing means it's basically returning the base64 data of the image to you, which is the default behavior if no image hosting service is set up.

The behavior seems to be related to that; if you post your code I can maybe eyeball it and spot the problem. Are you using TypeScript?

I am using TypeScript. This is my code, minus comments, although I tried to make everything neat enough that it shouldn't need them.

import { addFilter, addCallback } from '@factor/api';
import { BlobServiceClient, ContainerClient } from '@azure/storage-blob';
import { PostAttachment } from '@factor/attachment';

const getService = (): {
  service: BlobServiceClient | undefined,
  containerName: string | undefined,
} => {
  const connectionString = process.env.FACTOR_AZURE_PLUGIN_CONNECTION_STRING;
  const containerName = process.env.FACTOR_AZURE_PLUGIN_CONTAINER_NAME;

  let service;

  if (connectionString)
    service = BlobServiceClient.fromConnectionString(connectionString);

  return { service, containerName };
};

const getContainer = async (): Promise<ContainerClient> => {
  return new Promise((resolve, reject) => {
    const { service, containerName } = getService();
    const container = service.getContainerClient(containerName);

    if (!container.exists())
      container.create().then((res) => { resolve(container) });
    else
      resolve(container);
  });
};

export const setup = (): void => {
  if (
    !process.env.FACTOR_AZURE_PLUGIN_CONTAINER_NAME ||
    !process.env.FACTOR_AZURE_PLUGIN_CONNECTION_STRING
  ) {
    addFilter({
      key: 'modernClassicAzureStorageSetup',
      hook: 'setup-needed',
      callback: (__: { title: string }[]) => {
        return [
          ...__,
          {
            title: 'Plugin: Azure Storage connection string',
            file: '.env',
            name: 'FACTOR_AZURE_PLUGIN_CONNECTION_STRING',
          },
        ];
      },
    });

    return;
  }

  addFilter({
    key: 'modernClassicAzureHandleUrl',
    hook: 'storage-attachment-url',
    priority: 200,
    callback: async ({
      buffer,
      key,
      mimetype,
    }: {
      buffer: Buffer,
      key: string,
      mimetype: string
    }) => {
      const { service, containerName } = getService();
      const container = await getContainer();
      if (!service || !containerName || !container) return;

      return new Promise((resolve, reject) => {
        const file = container.getBlockBlobClient(key);
        const length = Buffer.byteLength(buffer);

        file.upload(buffer, length, { 
          blobHTTPHeaders: { 
            blobContentType: mimetype,
          },
        })
        .then((res) => {
          if (!res.errorCode) resolve(file.url);
          else reject(res);
        })
        .catch((err) => {
          reject(err);
        });
      });
    },
  });

  addCallback({
    key: 'modernClassicAzureDeleteImage',
    hook: 'delete-attachment',
    callback: async (doc: PostAttachment) => {
      const { service, containerName } = getService();
      const container = await getContainer();
      if (!service || !containerName || !container) return;

      const key = doc.url.split(`${containerName}/`)[1];
      const file = container.getBlockBlobClient(key);

      return new Promise((resolve, reject) => {
        file.delete().then((res) => {
          if (!res.errorCode) resolve();
          else reject(res);
        })
        .catch((err) => {
          reject(err);
        });
      });
    },
  });
};

setup();
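One thing worth double-checking in `getContainer()` above: in the v12 SDK, `ContainerClient.exists()` returns a `Promise<boolean>`, and a Promise object is always truthy, so `if (!container.exists())` can never take the create branch. A minimal stand-in (no SDK involved) showing the pitfall:

```typescript
// Stand-in for an async existence check such as ContainerClient.exists():
// every async function returns a Promise, and Promises are always truthy.
async function exists(): Promise<boolean> {
  return false; // pretend the container is missing
}

async function main(): Promise<void> {
  const pending = exists();
  console.log(!pending);         // false -- the un-awaited Promise is truthy
  console.log(!(await pending)); // true  -- awaiting yields the real boolean
}

main();
```

(If I'm reading the v12 docs right, `ContainerClient.createIfNotExists()` sidesteps the check entirely.)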

I think your problem is that you're returning promises instead of the results of the promises.

I highly recommend just switching to await since you're already using async. Lots easier to reason about!
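The transformation is mechanical. With a stand-in `upload()` in place of the real `BlockBlobClient.upload` (the response shape below is a simplification), the two callbacks behave the same, but the await version reads linearly and lets rejections propagate on their own:

```typescript
// Simplified stand-in for the upload call; the real SDK response has more fields.
interface UploadResponse { errorCode?: string }

async function upload(): Promise<UploadResponse> {
  return {}; // success: no errorCode
}

// Promise-chain style, as in the plugin above:
function urlViaPromise(): Promise<string> {
  return new Promise((resolve, reject) => {
    upload()
      .then((res) => (res.errorCode ? reject(res) : resolve("https://example.invalid/blob")))
      .catch((err) => reject(err));
  });
}

// await style: same behavior, flat control flow, no manual reject plumbing.
async function urlViaAwait(): Promise<string> {
  const res = await upload(); // a rejection here propagates automatically
  if (res.errorCode) throw res;
  return "https://example.invalid/blob";
}

async function main(): Promise<void> {
  console.log(await urlViaPromise());
  console.log(await urlViaAwait());
}

main();
```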

[Screenshot: Screen Shot 2020-05-28 at 9:44:37 PM]

Also, pro tip: I realized I left this out of the docs, but you can attach the Node inspector by passing a --inspect flag when starting Factor. It can help with debugging this type of thing.

That --inspect flag is actually good to know about. I guess I never thought to run yarn factor help to see if Factor had any kind of integrated debugger. I actually used await originally instead of Promises, since that's how the Google Cloud plugin handles things, but when I was having issues I decided to switch over to Promises like the S3 plugin.