I have a logins table which looks like this:
 person_id |  login_at  | points_won
-----------+------------+------------
         1 | 2017-02-02 |
         1 | 2017-02-01 |
         2 | 2017-02-01 |          2
         1 | 2017-01-29 |          2
         2 | 2017-01-28 |
         2 | 2017-01-25 |          1
         3 | 2017-01-22 |
         3 | 2017-01-21 |
         1 | 2017-01-10 |          3
         1 | 2017-01-01 |          1
I want to generate a result set with an additional last_points_won column, which should work something like this: for each row, partition by person_id, order the partition by login_at descending, then report the first non-null points_won among the ordered rows of the partition, starting at the current row (if any).
It should result in something like this:
 person_id |  login_at  | points_won | last_points_won
-----------+------------+------------+-----------------
         1 | 2017-02-02 |            |               2
         1 | 2017-02-01 |            |               2
         2 | 2017-02-01 |          2 |               2
         1 | 2017-01-29 |          2 |               2
         2 | 2017-01-28 |            |               1
         2 | 2017-01-25 |          1 |               1
         3 | 2017-01-22 |            |
         3 | 2017-01-21 |            |
         1 | 2017-01-10 |          3 |               3
         1 | 2017-01-01 |          1 |               1
Or in plain words: for each row, give me either the points won during this login, or, if there are none, the points won at the person's latest previous login where they actually won some points.
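For reference, the sample data can be reproduced with something like the following setup sketch (the table name tbl, which the queries below use, and the column types are assumptions; one answer calls the table mytable instead):

CREATE TABLE tbl (
    person_id  int,
    login_at   date,
    points_won int    -- NULL when no points were won at that login
);

INSERT INTO tbl (person_id, login_at, points_won) VALUES
    (1, '2017-02-02', NULL),
    (1, '2017-02-01', NULL),
    (2, '2017-02-01', 2),
    (1, '2017-01-29', 2),
    (2, '2017-01-28', NULL),
    (2, '2017-01-25', 1),
    (3, '2017-01-22', NULL),
    (3, '2017-01-21', NULL),
    (1, '2017-01-10', 3),
    (1, '2017-01-01', 1);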
This could be achieved within a single window function with the IGNORE NULLS option of last_value(), but that's not supported in PostgreSQL yet. One alternative is the FILTER (WHERE ...) clause, but that only works when the window function is an aggregate function in the first place (which last_value() is not, though something similar can easily be created with CREATE AGGREGATE). To solve this with only built-in aggregates, you can use array_agg() too:
SELECT (tbl).*,
       all_points_won[array_upper(all_points_won, 1)] last_points_won
FROM   (SELECT tbl,
               array_agg(points_won) FILTER (WHERE points_won IS NOT NULL)
                 OVER (PARTITION BY person_id ORDER BY login_at) all_points_won
        FROM   tbl) s
Note: the sub-query is not needed if you create a dedicated last_agg() aggregate, like:
CREATE FUNCTION last_val(anyelement, anyelement)
RETURNS anyelement
LANGUAGE SQL
IMMUTABLE
CALLED ON NULL INPUT
AS 'SELECT $2';
CREATE AGGREGATE last_agg(anyelement) (
SFUNC = last_val,
STYPE = anyelement
);
SELECT tbl.*,
       last_agg(points_won) FILTER (WHERE points_won IS NOT NULL)
         OVER (PARTITION BY person_id ORDER BY login_at) last_points_won
FROM   tbl;
Rextester sample
Edit: once the IGNORE NULLS option is supported in PostgreSQL, you will be able to use the following query (which should already work in Amazon Redshift):
SELECT tbl.*,
       last_value(points_won IGNORE NULLS)
         OVER (PARTITION BY person_id ORDER BY login_at
               ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) last_points_won
FROM   tbl;
select *,
       -- each group contains exactly one non-null points_won (the row that opened
       -- the group), so min() simply returns that value for every row of the group
       min(points_won) over (partition by person_id, group_id) as last_points_won
from  (select *,
              -- count() ignores nulls, so the running count only advances on rows
              -- where points_won is not null; it labels the group started by that row
              count(points_won) over (partition by person_id
                                      order by login_at) as group_id
       from   mytable) t
+-----------+------------+------------+----------+-----------------+
| person_id | login_at   | points_won | group_id | last_points_won |
+-----------+------------+------------+----------+-----------------+
| 1         | 2017-01-01 | 1          | 1        | 1               |
| 1         | 2017-01-10 | 3          | 2        | 3               |
| 1         | 2017-01-29 | 2          | 3        | 2               |
| 1         | 2017-02-01 | (null)     | 3        | 2               |
| 1         | 2017-02-02 | (null)     | 3        | 2               |
| 2         | 2017-01-25 | 1          | 1        | 1               |
| 2         | 2017-01-28 | (null)     | 1        | 1               |
| 2         | 2017-02-01 | 2          | 2        | 2               |
| 3         | 2017-01-21 | (null)     | 0        | (null)          |
| 3         | 2017-01-22 | (null)     | 0        | (null)          |
+-----------+------------+------------+----------+-----------------+