#1731920421
Got my first contribution and fork on this project. Happy to see his version deployed and taking on its own look. This project was made to be forked; I hope more people follow suit.
Hi Tim o/
#1731844234
No, Billy, I haven’t done that dance since my wife died
#1731188466
I am making a scraper for Spotify metadata. My testing numbers indicate that I could scrape 100% of Spotify in less than a week; something feels wrong.
INFO Stats per minute id=0 request=204 tracks=2451
INFO Stats per minute id=2 request=193 tracks=2086
INFO Stats per minute id=1 request=212 tracks=2392
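For context, each worker logs one of those lines per minute. Roughly like this sketch (log/slog and atomic counters assumed; the names here are illustrative, not the real scraper code):

package main

import (
	"context"
	"log/slog"
	"sync/atomic"
	"time"
)

// logStats emits one "Stats per minute" line per worker; the scraper
// goroutine bumps requests/tracks as it goes.
func logStats(ctx context.Context, id int, requests, tracks *atomic.Int64) {
	t := time.NewTicker(time.Minute)
	defer t.Stop()
	for {
		select {
		case <-ctx.Done():
			return
		case <-t.C:
			// Swap(0) reads and resets the counter in one step.
			slog.Info("Stats per minute",
				"id", id,
				"request", requests.Swap(0),
				"tracks", tracks.Swap(0))
		}
	}
}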
#1730630725
Golang's new iterators came in handy for request pagination. I know the code is not optimal, but it is very readable and it is just for a proof of concept. I am trying to get Spotify metadata in bulk. Hopefully I won't get IP banned, fingers crossed.
for chunk := range slices.Chunk(allSimpleTracks, 100) {
	ids := make([]spotify.ID, len(chunk))
	for i, a := range chunk {
		ids[i] = a.ID
	}
	// The audio-features endpoint takes up to 100 IDs per request.
	f, err := client.GetAudioFeatures(ctx, ids...)
	if err != nil {
		return nil, err
	}
	// The tracks endpoint only takes 50 IDs, so chunk again.
	fullTracks := make([]*spotify.FullTrack, 0, len(ids))
	for subChunk := range slices.Chunk(ids, 50) {
		full, err := client.GetTracks(ctx, subChunk, spotify.Limit(50))
		if err != nil {
			return nil, err
		}
		fullTracks = append(fullTracks, full...)
	}
	for i := range len(ids) {
		allTracks = append(allTracks, &FullerTrack{
			Track:    fullTracks[i],
			Features: f[i],
		})
	}
}
#1730070397
Can’t get enough of this guy, makes me want to get into drumming.
#1729278424
This is peak internet for me. Everybody should have a site like this instead of social media.
#1728621397
3-6-5-3-6-5-3-6-5-3-6-5-3-6-5-3-6-5-3-6-5…
#1728318672
Apart from a few minor glitches (which have been fixed), the scraper and scheduler run fine. The frontend is also coming along nicely; it needs a few more pages and then the CSS. I also need to figure out how to load a dynamic amount of columns from a materialized view; shouldn't be too hard, but I want to make it fault tolerant. I don't know what I want to do regarding design. But I know somebody who maybe wants to help me, fingers crossed.
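For the dynamic columns, the rough idea is to let the result set report its own column names at runtime instead of hard-coding a struct. A minimal sketch with pgx (the view name is the real one from the rewrite; the pool wiring and function name are placeholders):

package main

import (
	"context"

	"github.com/jackc/pgx/v5/pgxpool"
)

// summaryRows reads the whole materialized view without knowing
// its column set ahead of time.
func summaryRows(ctx context.Context, pool *pgxpool.Pool) ([]string, [][]any, error) {
	rows, err := pool.Query(ctx, "SELECT * FROM device_analysis_summary")
	if err != nil {
		return nil, nil, err
	}
	defer rows.Close()

	// The column names come from the result set itself.
	var cols []string
	for _, fd := range rows.FieldDescriptions() {
		cols = append(cols, string(fd.Name))
	}

	var data [][]any
	for rows.Next() {
		vals, err := rows.Values()
		if err != nil {
			return nil, nil, err
		}
		data = append(data, vals)
	}
	return cols, data, rows.Err()
}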
#1727454186
DOWNTEMPO
#1727195369
The Python (scraper) rewrite is done. Almost no dependencies now. Reduced the Docker image from 1.2 GB to less than 100 MB. Feels a lot better to update and modify too. Now time for the frontend webserver.
beautifulsoup4==4.12.3
requests==2.32.3
python-dotenv==1.0.1
psycopg==3.2.2
psycopg-binary==3.2.2
#1726927465
The amount of cursed SQL that I am writing just to keep it all in pure SQL… It would be way faster to just build the query in Python. Anyway, the Corap rewrite is coming along nicely.
DO $$
DECLARE
	cols text;
	query text;
BEGIN
	-- Build the dynamic column list, ordered by priority.
	SELECT string_agg(quote_ident(name) || ' text', ', ')
	INTO cols
	FROM (
		SELECT name
		FROM (SELECT DISTINCT name, priority FROM device_analyses) AS o
		ORDER BY priority DESC
	) AS o;
	BEGIN
		EXECUTE 'DROP MATERIALIZED VIEW IF EXISTS device_analysis_summary';
		-- crosstab() comes from the tablefunc extension; the category
		-- query must use the same ordering as the column list above.
		query := format('
			CREATE MATERIALIZED VIEW device_analysis_summary AS
			SELECT *
			FROM crosstab(
				''SELECT d.deveui, da.name, da.value
				FROM devices d
				LEFT JOIN device_analyses da ON d.deveui = da.device_id
				ORDER BY d.deveui, da.name'',
				''SELECT name
				FROM (SELECT DISTINCT name, priority FROM device_analyses) AS o
				ORDER BY priority DESC''
			) AS ct(deveui text, %s);
		', cols);
		EXECUTE query;
	EXCEPTION
		WHEN OTHERS THEN
			-- Everything in this block is already rolled back by the time we
			-- land here, and an explicit ROLLBACK is not allowed inside an
			-- EXCEPTION handler, so just report and bail.
			RAISE NOTICE 'Error creating materialized view: %', SQLERRM;
			RETURN;
	END;
END $$;
#1726858812
Brussels-based proto-techno
#1726662585
Time to rewrite Corap finally, starting with the scheduler. The current Docker image is more than 1 GB. Going to remove a lot of dependencies. Also going to rewrite the frontend; I have learned a lot about Golang since starting that project.
#1724482570
sqr_dump_2!! Found some nice older photos while making backups. Wasn't soon-ish like I said, but soon enough.
#1723758927
Please forgive me, for I have zyned, and will zyn again
#1723020840
#1722452920
Templ is now a recognized language on GitHub. Let's fucking Gooooo. Still the best HTML templating language I have used. Django's is also pretty good, though not as good or as fast.
#1722283789
So… this regex broke my site… because it is a 'YouTube URL' but it has no ID, so it breaks things and panics. I should really learn regex. But not now; hotfix for the win.
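For future reference, a stricter pattern would simply refuse to match when the 11-character video ID is missing. Just a sketch, not the actual regex from the site:

package main

import (
	"errors"
	"regexp"
)

// Matches watch?v= and youtu.be URLs, but only when a video ID
// (11 chars of [A-Za-z0-9_-]) is actually present.
var ytID = regexp.MustCompile(`(?:youtube\.com/watch\?v=|youtu\.be/)([A-Za-z0-9_-]{11})`)

func videoID(url string) (string, error) {
	m := ytID.FindStringSubmatch(url)
	if m == nil {
		return "", errors.New("no YouTube video ID in URL")
	}
	return m[1], nil
}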
#1721423976
Pathetic
I like how all this artist's cover art is at different levels of deranged.
#1720948971
I am officially done with ORMs. My latest experiment was ent, a codegen-based ORM for Golang. It works fine, I like the API, and then you want to do something slightly complex and it just doesn't work. I wanted a many-to-many relation with extra data in the join table; so far so good, this did work. Until I wanted to make it not unique. I needed this because, in my current project, I wanted to add one track multiple times to a playlist. But this was not allowed: the codegen would not build. Other people have the same issue and no solution is known. So my solution is to rewrite my code again, this time with pgx. I have also tried and used sqlc in some projects, but it won't scale for my current project. I do like it a lot for smaller projects though; this blog uses it, for example.
I have tried a lot of ORMs over the years, but I am finally done, not a chance. They are great until they are not, and then they are just a pain.
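For what it's worth, the thing ent refused to generate is trivial in plain SQL through pgx: a join table with no uniqueness constraint on (playlist, track), so the same track can appear twice. A sketch with made-up table and column names:

package main

import (
	"context"

	"github.com/jackc/pgx/v5/pgxpool"
)

// addTrack appends a track to a playlist. playlist_tracks has no
// UNIQUE (playlist_id, track_id) constraint, so adding the same
// track twice just works.
func addTrack(ctx context.Context, db *pgxpool.Pool, playlistID int64, trackID string, pos int) error {
	_, err := db.Exec(ctx,
		`INSERT INTO playlist_tracks (playlist_id, track_id, position)
		 VALUES ($1, $2, $3)`,
		playlistID, trackID, pos,
	)
	return err
}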
#1720600927
I have taken a liking to square photos recently. All photos are quickly taken in the moment (with a shitty phone) and later edited and reframed. Just don't look too close.
More to come, soon-ish.
#1720473853
Not a sound from the crowd when the heat come
#1720389426
Spent the last 2 weeks scraping supermarkets for a vacation job, and the average website is hot garbage. One website pulled the location data of all stores in all countries (1000+ stores) just to show 1 marker on a small map. Another had HTML nested more than 50 elements deep. Not to speak of all the badly formatted data that I had to parse (including invalid JSON). Just when you think you've seen it all, they come up with some more BS. I am going to explode.
#1719641212
New album dropped!!
The Distortion That Creates Them - Istasha
#1719158831
Everybody should write their own authentication at least once. I am currently writing a session-based authentication system for one of my upcoming projects. It forced me to learn all the parts of the system. And once you've done it, authentication is no longer as scary and complicated as it might otherwise seem. Yes, it could be a vulnerability, but so can many things in your application.
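The core of it is smaller than people think. A sketch of the token half (names and lifetimes are placeholders; the idea is to store only the hash server-side and hand the raw token to the browser):

package main

import (
	"crypto/rand"
	"crypto/sha256"
	"encoding/base64"
	"encoding/hex"
	"net/http"
	"time"
)

// newSessionToken returns a random token for the cookie and the
// SHA-256 hash to persist in the sessions table; if the DB leaks,
// the stored hashes cannot be replayed as cookies.
func newSessionToken() (raw, hash string, err error) {
	b := make([]byte, 32)
	if _, err = rand.Read(b); err != nil {
		return "", "", err
	}
	raw = base64.RawURLEncoding.EncodeToString(b)
	sum := sha256.Sum256([]byte(raw))
	return raw, hex.EncodeToString(sum[:]), nil
}

// setSessionCookie sends the raw token to the browser with the
// usual hardening flags.
func setSessionCookie(w http.ResponseWriter, raw string) {
	http.SetCookie(w, &http.Cookie{
		Name:     "session",
		Value:    raw,
		Path:     "/",
		Expires:  time.Now().Add(30 * 24 * time.Hour),
		HttpOnly: true,
		Secure:   true,
		SameSite: http.SameSiteLaxMode,
	})
}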