Different versions of CVXPY generate different results





Owing to my limited knowledge of CVXPY, I am having trouble reconciling the results of a simple optimization problem across two different versions of the library.



With CVXPY version 0.4.5 I wrote the problem as:



import numpy as np
from cvxpy import *

n = 5
np.random.seed(123)
g1 = np.random.rand(2*n, 1)   # random linear coefficients
H1 = np.eye(2*n)              # identity matrix for the quadratic term
w = Variable(2*n)
gamma = Parameter(sign="positive")
ret = -g1.T*w                 # linear "return" term
risk = quad_form(w, H1)       # quadratic "risk" term
prob = Problem(Maximize(ret - gamma*risk),
               [w >= 0])
gamma.value = 0.5
prob.solve()
res = w.value


and res equals:



res = [[ 2.86653834e-12],
       [ 2.47912037e-11],
       [ 3.73027873e-11],
       [ 7.13532730e-12],
       [ 2.31133274e-12],
       [ 1.27710498e-11],
       [-2.50944234e-12],
       [ 3.15803733e-12],
       [ 9.90353521e-12],
       [ 1.46452182e-11]]


However, with CVXPY version 1.0.8 I write almost the same code:



# (same imports as above)
n = 5
np.random.seed(123)
g1 = np.random.rand(2*n, 1)
H1 = np.eye(2*n)
w = Variable(2*n)
gamma = Parameter(nonneg=True)   # CVXPY 1.x spelling of sign="positive"
ret = -g1.T*w
risk = quad_form(w, H1)
prob = Problem(Maximize(ret - gamma*risk),
               [w >= 0])
gamma.value = 0.5
prob.solve()
res = w.value


The result is:



(Pdb) res
array([6.66098380e-25, 2.73633363e-25, 2.16955532e-25, 5.27275998e-25,
       6.88070573e-25, 4.04646723e-25, 9.37904145e-25, 6.54954091e-25,
       4.60002892e-25, 3.75018828e-25])


The only change I made for version 1.0.8 is to use the attribute 'nonneg=True' instead of 'sign=positive', which I believe is essentially the same thing. Can someone help me out here? What are the possible reasons that the results are so different?



Many thanks










Tags: quadratic-programming, cvxpy






asked Nov 16 '18 at 20:35 by user45668









  • The results look approximately the same to me: 0 for all w's. The difference is noise.

    – Erwin Kalvelagen
    Nov 16 '18 at 21:41











  • It's possible the difference is noise, since if I flip the sign of g1 the results become the same. What is strange is that the noise shouldn't be this large; the two results are certainly not on the same scale. And in other cases I tested, the difference is large enough to affect the decision making of the application.

    – user45668
    Nov 16 '18 at 22:29













  • If this is noise, then I guess my question is when we can trust the results in similar cases, since this noise can give us quite different results.

    – user45668
    Nov 16 '18 at 22:59











  • Note that interior point solvers require variables to stay strictly positive when there is a bound x >= 0. How close they get to the bound depends on many things. You can solve with the verbose option to see the solver logs; there may be some indication of what is different. But again, for interior point solvers it is normal for variables to sit somewhat inside their bounds.

    – Erwin Kalvelagen
    Nov 17 '18 at 10:40
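
Following the verbose suggestion above, a brief sketch of how one might compare the two runs. It continues from the question's CVXPY 1.0.8 snippet (so it reuses prob), and assumes the solver_stats attribute introduced in CVXPY 1.0:

prob.solve(verbose=True)              # print the backend solver's iteration log
print(prob.status)                    # e.g. "optimal"
print(prob.solver_stats.solver_name)  # which solver CVXPY dispatched to (e.g. OSQP)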



















1 Answer
CVXPY 1.0 uses the OSQP solver for problems like yours, whereas CVXPY 0.4 uses ECOS; that is why the results differ. But ultimately, numbers this close to zero should be treated as zero. If your program behaves differently depending on whether the output is -1e-12 or 1e-12, you may want to make the program less sensitive.
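
A rough, self-contained sketch of both suggestions (not part of the original answer): pin the solver to ECOS, the conic solver that CVXPY 0.4.x used by default, and clip numerically tiny entries to zero before using them downstream. The 1e-9 threshold is an arbitrary illustrative choice, and ECOS must be installed.

import numpy as np
import cvxpy as cp

# Rebuild the question's problem in CVXPY 1.x syntax.
n = 5
np.random.seed(123)
g1 = np.random.rand(2*n, 1)
H1 = np.eye(2*n)
w = cp.Variable(2*n)
gamma = cp.Parameter(nonneg=True)
gamma.value = 0.5
prob = cp.Problem(cp.Maximize(-g1.T*w - gamma*cp.quad_form(w, H1)),
                  [w >= 0])

# Force the same backend in both versions instead of relying on the default choice.
prob.solve(solver=cp.ECOS)

# Treat entries below a small tolerance as exact zeros before any downstream decision.
w_clean = np.where(np.abs(w.value) < 1e-9, 0.0, w.value)
print(prob.status, w_clean)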






answered Nov 21 '18 at 2:38 by steven








