Airflow “This DAG isn't available in the webserver DagBag object”

I currently have the Airflow scheduler set up on Linux server A and the Airflow webserver on Linux server B. Neither server has Internet access. I have run airflow initdb on server A, and all the DAGs are kept on server A.



However, when I refresh the webserver UI, it keeps showing this error message:



This DAG isn't available in the webserver DagBag object


How do I configure the DAG folder for the webserver (server B) so that it reads the DAGs from the scheduler (server A)?



I am using BashOperator. Is the CeleryExecutor a must?
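
For context, the DAGs on server A are simple BashOperator DAGs along the lines of the minimal sketch below (the dag_id, dates and bash command here are placeholders, not the real jobs):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    # Minimal example of the kind of DAG kept in server A's dags_folder.
    # The dag_id, start_date, schedule and command are placeholders.
    dag = DAG(
        dag_id="example_bash_dag",
        start_date=datetime(2018, 11, 1),
        schedule_interval="@daily",
    )

    run_script = BashOperator(
        task_id="run_script",
        bash_command="echo 'hello from server A'",
        dag=dag,
    )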



Thanks in advance










airflow

asked Nov 12 at 6:55 by i2cute, edited Nov 13 at 3:25 by sulabh chaturvedi


  • My setup is slightly different: the scheduler and the webserver are on different servers.
    – i2cute
    Nov 13 at 3:28












  • How exactly does that make the product behaviour different?
    – sulabh chaturvedi
    Nov 13 at 3:33










  • The product behaviour should be the same, as I thought. But the webserver needs to read the DAGs stored on the other server, and I am having problems with this. How can I share the DAGs without copying them over?
    – i2cute
    Nov 13 at 3:45










  • Possible duplicate of stackoverflow.com/questions/47834925/…
    – sulabh chaturvedi
    Nov 13 at 4:05


















1 Answer

















The scheduler has found your dags_folder and the DAG definitions in it, and is scheduling them accordingly. The webserver, however, can "see" these DAGs only through their existence in the database; it can't find them in its own dags_folder path.
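
As a quick check (not something Airflow does for you, and assuming the Airflow 1.x airflow.models.DagBag API), you can build a DagBag on server B under the same environment the webserver uses and see which DAGs it can actually load from its local dags_folder:

    from airflow.models import DagBag

    # Build a DagBag the same way the webserver does, i.e. from the local
    # dags_folder configured on this host. Run this on server B.
    dagbag = DagBag()
    print("dags_folder being parsed:", dagbag.dag_folder)
    print("DAG ids found locally:", sorted(dagbag.dags.keys()))
    print("import errors:", dagbag.import_errors)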



You need to ensure that the dags_folder on both servers contains the same files and that the two are kept in sync with one another. This is out of scope for Airflow, and it won't handle it on your behalf.
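
Both components read that path from the dags_folder setting under [core] in their own airflow.cfg. A small sketch, assuming Airflow 1.x's configuration module, to print what each host is actually configured with:

    from airflow.configuration import conf

    # Print the dags_folder this host is configured with; the files under this
    # path must be the same on server A (scheduler) and server B (webserver).
    print(conf.get("core", "dags_folder"))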






answered Nov 12 at 19:42 by joeb (accepted)












  • Do you mean that I need to copy the DAGs from Linux A (scheduler) to Linux B (webserver)? Is there an easy way, like mapping a drive?
    – i2cute
    Nov 13 at 1:59










  • Essentially, yes. How you accomplish that is up to you.
    – joeb
    Nov 13 at 1:59










  • Thanks. I am still figuring out how to map the drive.
    – i2cute
    Nov 13 at 2:06
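
Following up on the comment thread above: if a mapped drive or NFS mount is not an option, one hypothetical approach is a small periodic job on server B that pulls the scheduler's dags_folder over SSH; the host name and paths below are placeholders:

    import subprocess

    # Hypothetical periodic sync (e.g. run from cron on server B): pull the
    # scheduler's dags_folder over SSH so both hosts see the same DAG files.
    SOURCE = "serverA:/path/to/airflow/dags/"  # placeholder remote host and path
    DEST = "/path/to/airflow/dags/"            # placeholder local dags_folder

    subprocess.run(["rsync", "-az", "--delete", SOURCE, DEST], check=True)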

















