Grouping by time difference in Amazon Redshift












I'm using the following query:



SELECT a.session_id,
       a.created_at,
       COUNT(DISTINCT a.mongo_id) AS events
FROM table1 a
JOIN table1 b ON a.session_id = b.session_id
GROUP BY a.session_id,
         a.created_at
ORDER BY a.session_id,
         a.created_at,
         COUNT(DISTINCT a.mongo_id) DESC


to get the following result:



Session1 2018-10-09 14:04:31.0 22
Session1 2018-10-09 14:04:32.0 10
Session1 2018-10-09 14:04:34.0 1
Session1 2018-10-09 14:04:38.0 1
Session1 2018-10-09 14:04:41.0 1
Session1 2018-10-09 14:04:42.0 1
Session1 2018-10-09 14:04:43.0 2
Session1 2018-10-09 14:04:44.0 2
Session1 2018-10-09 14:04:45.0 1
Session1 2018-10-09 14:04:46.0 2
Session1 2018-10-09 14:04:47.0 2
Session1 2018-10-09 14:04:50.0 2
Session1 2018-10-09 14:04:51.0 2
Session1 2018-10-09 14:04:52.0 1
Session1 2018-10-09 14:04:53.0 1
Session1 2018-10-09 14:04:55.0 1
Session1 2018-10-09 14:04:56.0 1
Session1 2018-10-09 14:04:57.0 1
Session1 2018-10-09 14:05:00.0 1
Session1 2018-10-09 14:05:01.0 2
Session1 2018-10-09 14:05:03.0 3
Session1 2018-10-09 14:05:06.0 1
Session1 2018-10-09 14:05:07.0 2
Session1 2018-10-09 14:05:09.0 4
Session1 2018-10-09 14:05:10.0 30


I would like to group all the events occurring within a 3-second window to get the following result:



Session1 2018-10-09 14:04:31.0 33
Session1 2018-10-09 14:04:38.0 2
Session1 2018-10-09 14:04:42.0 6
Session1 2018-10-09 14:04:46.0 4
Session1 2018-10-09 14:04:50.0 6
Session1 2018-10-09 14:04:55.0 3
Session1 2018-10-09 14:05:00.0 6
Session1 2018-10-09 14:05:06.0 7
Session1 2018-10-09 14:05:10.0 30


I would like to sum all the occurrences within each 3-second period to get the resulting count column shown above: for example, the first window's total of 33 is 22 + 10 + 1, the counts at 14:04:31, 14:04:32, and 14:04:34.



To try to achieve this, I used the following query:



WITH t AS
(
    SELECT a.session_id,
           a.created_at,
           COUNT(DISTINCT a.mongo_id) AS events
    FROM table1 a
    JOIN table1 b ON a.session_id = b.session_id
    GROUP BY a.session_id,
             a.created_at
    ORDER BY a.session_id,
             a.created_at,
             COUNT(DISTINCT a.mongo_id) DESC
)
SELECT a.session_id,
       TIMESTAMP WITH TIME ZONE 'epoch' + INTERVAL '1 second' * ROUND(EXTRACT('epoch' FROM a.created_at) / 3) * 3 AS TIMESTAMP,
       SUM(b.events)
FROM t AS a
JOIN t AS b ON a.session_id = b.session_id
GROUP BY a.session_id,
         ROUND(EXTRACT('epoch' FROM a.created_at) / 3)
ORDER BY a.session_id,
         TIMESTAMP


but this gives me incorrect numbers.



How do I achieve this? Any help would be much appreciated.










sql amazon-redshift

asked Nov 13 '18 at 22:05 by Patthebug (edited Nov 13 '18 at 23:09)

  • I don't understand your query. You are not using table b.
    – Gordon Linoff
    Nov 13 '18 at 22:06

  • Please remove the postgresql tag.
    – Jon Scott
    Nov 13 '18 at 22:57

  • I'm confused: 1. Do you want to assign the start of the three seconds based on the data, or follow a fixed pattern [0-3, 3-6, 6-9, ...]? 2. Why are you joining t back to t when you are not using it? This is just asking for trouble!
    – hibernado
    Nov 14 '18 at 8:31

1 Answer

Let me assume you get the result you specify somehow. Then you can use window functions:



with results as (
      <whatever>
     )
select sessionid, min(created_at), max(created_at), sum(events)
from (select r.*,
             sum( (prev_ca < created_at - interval '3 second')::int ) over
                 (partition by sessionid
                  order by created_at
                  rows between unbounded preceding and current row) as grp
      from (select r.*,
                   lag(created_at) over (partition by sessionid order by created_at) as prev_ca
            from results r
           ) r
     ) r
group by sessionid, grp;


This determines where a group starts: it looks at the previous created_at and checks whether it is more than 3 seconds earlier. If so, a new group starts.



The cumulative sum of these group-start flags is a grouping identifier, which can then be used for aggregation.
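
For concreteness, here is one way the <whatever> placeholder could be filled in with the per-second aggregation from the question. This is only a sketch under a few assumptions: table1, session_id, created_at and mongo_id are taken from the question; the unused self-join on table1 is dropped (as the comments note, b is never referenced); the gap test is written with DATEDIFF and a CASE flag instead of a boolean-to-integer cast; and the first row of each session explicitly starts a window:

-- Sketch only: table1 and its columns come from the question; DATEDIFF and the
-- CASE flag are Redshift-friendly spellings of the same "gap > 3 seconds" test.
WITH results AS (
    SELECT session_id,
           created_at,
           COUNT(DISTINCT mongo_id) AS events
    FROM table1
    GROUP BY session_id, created_at
)
SELECT session_id,
       MIN(created_at) AS window_start,
       SUM(events)     AS events
FROM (SELECT r.*,
             -- running total of the group-start flags = window identifier
             SUM(CASE WHEN prev_ca IS NULL
                        OR DATEDIFF(second, prev_ca, created_at) > 3
                      THEN 1 ELSE 0 END)
                 OVER (PARTITION BY session_id
                       ORDER BY created_at
                       ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS grp
      FROM (SELECT r.*,
                   LAG(created_at) OVER (PARTITION BY session_id
                                         ORDER BY created_at) AS prev_ca
            FROM results r
           ) r
     ) r
GROUP BY session_id, grp
ORDER BY session_id, window_start;

Like the answer above, this is gap-based: a new window starts whenever more than 3 seconds pass since the previous event.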






answered Nov 13 '18 at 22:13 by Gordon Linoff (edited Nov 13 '18 at 22:34)

  • This query doesn't execute. It doesn't seem to use the table results and the table r is used twice. I'm sorry but I'm a little confused.
    – Patthebug
    Nov 13 '18 at 22:20

  • @Patthebug . . . It now has the from clause.
    – Gordon Linoff
    Nov 13 '18 at 22:21

  • I now get the following error: Invalid operation: Aggregate window functions with an ORDER BY clause require a frame clause;
    – Patthebug
    Nov 13 '18 at 22:23

  • @Patthebug . . . That is a peculiarity of Redshift SQL.
    – Gordon Linoff
    Nov 13 '18 at 22:34
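
As background for the last two comments: Redshift requires an explicit frame clause whenever an aggregate window function such as sum() has an ORDER BY in its OVER clause, which is why the running sum in the answer spells out rows between unbounded preceding and current row. A minimal illustration, using a hypothetical table t(id, x):

-- Hypothetical table t(id, x), for illustration only.
-- This form fails on Redshift with the error quoted above:
--   SELECT id, SUM(x) OVER (ORDER BY id) FROM t;

-- Stating the frame explicitly satisfies Redshift:
SELECT id,
       SUM(x) OVER (ORDER BY id
                    ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS running_x
FROM t;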










