awslabs / aws-js-s3-explorer

AWS JavaScript S3 Explorer is a JavaScript application that uses AWS's JavaScript SDK and S3 APIs to make the contents of an S3 bucket easy to browse via a web browser.

copy/paste problem for objects with non-Latin characters in their names

kayaademogullari opened this issue · comments

Hi. I changed s3.makeUnauthenticatedRequest to s3.makeAuthenticatedRequest, so my clients can access their own private (non-website) buckets and upload/download files, add directories, rename, and copy/cut/paste files easily via the browser. But objects whose names contain non-Latin characters cannot be copied and pasted. This is a sample error from the console:

TypeError: Cannot convert string to ByteString because the character at index 23 has value 287 which is greater than 255.
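
For context, code point 287 here is "ğ" (U+011F). In the browser the SDK sends CopySource as the x-amz-copy-source request header, and header values can only hold characters with code points up to 255, which is why copies of Latin-only names work while this one fails. A quick console check (using a made-up Turkish file name) shows the offending code point:

// "ğ" (U+011F) cannot be placed in an HTTP header byte string
console.log('dosya-ğ.txt'.charCodeAt(6)); // 287 (> 255)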

My code is quite simple:

var s3 = new AWS.S3();
var params = {
    Bucket: 'myBucket',
    CopySource: 'myBucket' + '/folder_a/' + utf_8_file,
    Key: 'folder_b/' + utf_8_file,
    StorageClass: 'STANDARD_IA'
};
s3.copyObject(params, function (err, data) {
    if (err) console.log(err);
});

I can easily rename files with Latin names to either Latin or non-Latin names via copyObject, but objects that already have non-Latin names cannot be changed at all. What can be done?

Thanks for this @kayaademogullari. We're looking into this!

commented

Hi @kayaademogullari, could you please document the steps leading to this error, include the non-Latin names, and indicate exactly which code is generating the TypeError? Thanks.

I placed a checkbox into the renderObject function:

function renderObject(data, type, full) {
    /* if (isthisdocument(s3exp_config.Bucket, data)) {
        console.log("is this document: " + data);
        return fullpath2filename(data);
    } else */
    if (isfolder(data)) {
        console.log("is folder: " + data);
        return '<div class="block" id="' + data + '"><div class="img dir">&nbsp;</div><a data-s3="folder" data-prefix="' + data + '" href="' + object2hrefvirt(s3exp_config.Bucket, data) + '">' + prefix2folder(data) + '</a><p><input type="checkbox" class="messageCheckbox" name="' + data + '" id="' + data + '" value="' + data + '"></div>';
    } else {
        console.log("not folder/this document: " + data);
        return '<div class="block" id="' + data + '"><div class="img ' + uzant(data) + '">&nbsp;</div><a data-s3="object" href="' + object2hrefvirt(s3exp_config.Bucket, data) + '">' + fullpath2filename(data) + '</a><p><input type="checkbox" class="messageCheckbox" name="' + data + '" id="' + data + '" value="' + data + '"></div>';
    }
}

Here uzant(data) is a function that gets the file extension so the block can show a file or folder icon for each object. Below I've defined a function for selecting an object with a checkbox. When an object is checked, we get its name, and if we press the rename button, bootbox is called, the jQuery.rename function is triggered, and the user is asked for a new name. If the object name is Latin, we can give it any new name without a problem; otherwise copyObject does not work and the console gives us an error, for example:

TypeError: Cannot convert string to ByteString because the character at index 23 has value 287 which is greater than 255.

getChecked = function () {
    var result = $('input[type="checkbox"]:checked');
    if (result.length > 0) {
        var choosenObj = "";
        var utf_8_file;
        result.each(function () {
            choosenObj = $(this).val();
            // filenameUnderFolder is a function that splits the path
            utf_8_file = filenameUnderFolder(choosenObj);
        });
        console.log(choosenObj);
        jQuery.rename = function () {
            bootbox.prompt({
                title: "Enter a name for your file",
                value: utf_8_file,
                callback: function (result) {
                    if (result === null) {
                        // nothing happens
                    } else {
                        var s3 = new AWS.S3();
                        var params = {
                            Bucket: 'myBucket',
                            CopySource: 'myBucket' + '/folder_a/' + utf_8_file,
                            Key: 'folder_b/' + result,
                            StorageClass: 'STANDARD_IA'
                        };
                        s3.copyObject(params, function (err, data) {
                            if (err) console.log(err);
                        });
                    }
                }
            });
        };
    } else {
        console.log('you chose nothing.');
    }
};

The getChecked function above also covers copy, paste, move, delete, and getPresignedUrl functions in jQuery, just like rename, and they work fine. Of course, the copy, paste, and move functions have the same non-Latin name problem too.
By the way, I actually converted s3.makeUnauthenticatedRequest to s3.makeRequest, sorry for the mistake. After that, I can manage all of this via a new CognitoIdentityCredentials.
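
As a rough sketch of that credentials setup (the region and identity pool ID below are placeholders, not values from this thread), the browser SDK can be pointed at a Cognito identity pool before the S3 client is constructed:

// Minimal sketch: authenticate the browser SDK with Cognito before using S3.
// The region and IdentityPoolId are placeholders for your own pool.
AWS.config.update({
    region: 'eu-west-1',
    credentials: new AWS.CognitoIdentityCredentials({
        IdentityPoolId: 'eu-west-1:00000000-0000-0000-0000-000000000000'
    })
});

var s3 = new AWS.S3();
// s3.copyObject(...), s3.listObjects(...), etc. now run with the Cognito credentials.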

commented

Hi, I was able to copy an object to another object with a non-Latin filename as follows:

var AWS = require("aws-sdk");

var s3 = new AWS.S3();

var params = {
    Bucket: 'mybucket',
    CopySource: 'myfolder/myfile.txt',
    Key: 'myfolder/myfile\u0287.txt'
};

s3.copyObject(params, function (err, data) {
    if (err) console.log(err);
});

Thank you very much.

Hi, I've solved the problem by wrapping the file name in the CopySource parameter with encodeURI(), like this:

var s3 = new AWS.S3();
var params = {
    Bucket: 'myBucket',
    CopySource: 'myBucket' + '/folder_a/' + encodeURI(utf_8_file),
    Key: 'folder_b/' + result,
    StorageClass: 'STANDARD_IA'
};
s3.copyObject(params, function (err, data) {
    if (err) console.log(err);
});

That's all.
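
That works because encodeURI() percent-encodes the non-Latin characters into plain ASCII before the value goes into the x-amz-copy-source header (S3 expects the copy source to be URL-encoded and decodes it server-side), for example:

// "ğ" is UTF-8 encoded and percent-escaped, so the header stays within 0–255
console.log(encodeURI('folder_a/dosya-ğ.txt'));
// "folder_a/dosya-%C4%9F.txt"

The Key parameter does not need the same treatment, since the SDK already escapes the object key in the request path.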

commented

Good news, thanks for the update.