SQL LIKE in Spark SQL























I'm trying to implement a join in Spark SQL using a LIKE condition.



The column I am performing the join on is called 'revision'; a sample value from each table looks like this:



Table A:



8NXDPVAE


Table B:



[4,8]NXD_V%


Performing the join in SQL Server (A.revision LIKE B.revision) works just fine, but when doing the same in Spark SQL, the join returns no rows (with an inner join) or null values for Table B (with an outer join).



This is the query I am running:



val joined = spark.sql("SELECT A.revision, B.revision FROM RAWDATA A LEFT JOIN TPTYPE B ON A.revision LIKE B.revision")


The plan looks like this:



== Physical Plan ==
BroadcastNestedLoopJoin BuildLeft, LeftOuter, revision#15 LIKE revision#282, false
:- BroadcastExchange IdentityBroadcastMode
: +- *Project [revision#15]
: +- *Scan JDBCRelation(RAWDATA) [revision#15] PushedFilters: [EqualTo(bulk_id,2016092419270100198)], ReadSchema: struct<revision>
+- *Scan JDBCRelation(TPTYPE) [revision#282] ReadSchema: struct<revision>


Is it possible to perform a LIKE join like this or am I way off?






































sql regex apache-spark apache-spark-sql

asked Nov 6 '16 at 20:38 by Dan Markhasin, edited Nov 12 at 13:49 by Community
1 Answer



























You are only a little bit off. Spark SQL and Hive follow the SQL standard convention that the LIKE operator accepts only two special characters:





• _ (underscore) - matches any single character.

• % (percent) - matches any sequence of characters, including the empty one.


Square brackets have no special meaning in LIKE, so [4,8] matches only the literal string [4,8]:



          spark.sql("SELECT '[4,8]' LIKE '[4,8]'").show


+----------------+
|[4,8] LIKE [4,8]|
+----------------+
|            true|
+----------------+


To match complex patterns you can use the RLIKE operator, which supports Java regular expressions:



          spark.sql("SELECT '8NXDPVAE' RLIKE '^[4,8]NXD.V.*$'").show


+-----------------------------+
|8NXDPVAE RLIKE ^[4,8]NXD.V.*$|
+-----------------------------+
|                         true|
+-----------------------------+
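The difference between the two operators can also be sketched outside Spark: a LIKE pattern treats everything except _ and % literally. A small helper (hypothetical, not part of Spark's API) makes this explicit by translating a LIKE pattern into the equivalent Java regex:

```scala
// Translate a SQL LIKE pattern into an equivalent Java regular expression:
// only '_' and '%' are special; every other character is matched literally.
object LikeToRegex {
  def convert(pattern: String): String =
    pattern.map {
      case '_' => "."   // LIKE '_' matches any single character
      case '%' => ".*"  // LIKE '%' matches any sequence of characters
      case c   => java.util.regex.Pattern.quote(c.toString)  // literal match
    }.mkString("^", "", "$")

  def main(args: Array[String]): Unit = {
    // Under LIKE semantics, '[4,8]' is literal, so the pattern does NOT match:
    assert(!"8NXDPVAE".matches(convert("[4,8]NXD_V%")))
    // Under RLIKE (raw regex) semantics, '[4,8]' is a character class and matches:
    assert("8NXDPVAE".matches("^[4,8]NXD.V.*$"))
  }
}
```

Applied to the question, this means rewriting the join condition as A.revision RLIKE B.revision, with Table B's patterns stored as regular expressions instead of LIKE patterns. Note also that in a regex the character class [4,8] matches '4', '8', or the comma itself.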





answered Nov 6 '16 at 21:25 by user6910411