Copying files in a file share with Azure Data Factory: configuration problem



























I am trying to learn to use Azure Data Factory to copy data (a collection of CSV files in a folder structure) from an Azure File Share to a Cosmos DB instance.



In Azure Data Factory I'm creating a "Copy Data" activity and trying to set my file share as the source, using the following host:



mystorageaccount.file.core.windows.net\mystoragefilesharename



When trying to test the connection, I get the following error:



[{"code":9059,"message":"File path 'E:\approot\mscissstorage.file.core.windows.net\mystoragefilesharename' is not supported. Check the configuration to make sure the path is valid."}]



Should I move the data to another storage type, like a blob, or am I not entering the correct host URL?










      azure azure-storage azure-data-factory






asked Nov 14 '18 at 12:26 by orestisf









3 Answers






































You'll need to specify the host like "\\\\myserver\\share" (with each backslash escaped) if you create the pipeline JSON directly, or like \\myserver\share if you're using the UI to set up the pipeline.



          Here is more info:
          https://docs.microsoft.com/en-us/azure/data-factory/connector-file-system#sample-linked-service-and-dataset-definitions
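Following that docs page, a linked-service definition for an Azure file share via the file system connector might be sketched roughly as below. This is only an illustration: the service, account, share, and integration runtime names are placeholders, and note that every backslash in the host is doubled for JSON escaping.

```json
{
    "name": "AzureFileShareLinkedService",
    "properties": {
        "type": "FileServer",
        "typeProperties": {
            "host": "\\\\mystorageaccount.file.core.windows.net\\mystoragefilesharename",
            "userId": "AZURE\\mystorageaccount",
            "password": {
                "type": "SecureString",
                "value": "<storage account access key>"
            }
        },
        "connectVia": {
            "referenceName": "<integration runtime name>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```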






answered Nov 14 '18 at 14:40 by Nicolas Zhang
























          • Thanks for pointing me in the right direction - I've figured it out and am posting a reply.

            – orestisf
            Nov 16 '18 at 22:33

































          I believe that when you created the file linked service, you might have chosen the public IR. If you choose the public IR, a local path (e.g. C:\xxx, D:\xxx) is not allowed, because the machine that runs your job is managed by us and does not contain any customer data. Please use a self-hosted IR to copy your local files.






answered Nov 15 '18 at 1:58 by pinye.li
























          • No, I was trying to use an Azure file share as a dataset source, not a local one...

            – orestisf
            Nov 16 '18 at 22:34

































          Based on the link posted by Nicolas Zhang: https://docs.microsoft.com/en-us/azure/data-factory/connector-file-system#sample-linked-service-and-dataset-definitions and the examples provided therein, I was able to solve it and successfully create the copy activity. I had two errors (I'm configuring via the Data Factory UI, not the JSON directly):




          1. In the host path, the correct one should be: \\mystorageaccount.file.core.windows.net\mystoragefilesharename\myfolderpath

          2. The username and password must be the ones corresponding to the storage account, not the actual user's account, which I was erroneously using.
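A dataset pointing at the CSV folder on that share could then be sketched as follows. Again only an illustration: the dataset, folder, and `FileShareLinkedService` names are placeholders for whatever you called the linked service defined above.

```json
{
    "name": "FileShareCsvDataset",
    "properties": {
        "type": "FileShare",
        "linkedServiceName": {
            "referenceName": "FileShareLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "folderPath": "myfolderpath",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ",",
                "firstRowAsHeader": true
            }
        }
    }
}
```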






answered Nov 16 '18 at 22:39 by orestisf






















