ooc array sizes #878
I tried this, also:

```c
// main.c
#include "Array.h"
#include <stdio.h>

int main() {
    _lang_array__Array rtest__arr;
    rtest__arr = _lang_array__Array_new(float, 5 + 1);
    int length = rtest__arr.length;
    _lang_array__Array_set(rtest__arr, 5, float, 1337.0f);
    int data5 = _lang_array__Array_get(rtest__arr, 5, float);
}
```

```c
// Array.h
#pragma once
#ifndef ___lang_array___
#define ___lang_array___

#define array_malloc(size) calloc(1, (size))

#include <stdint.h>
#include <stdlib.h>

#define _lang_array__Array_new(type, size) ((_lang_array__Array) { size, array_malloc(size * sizeof(type)) });

#define _lang_array__Array_get(array, index, type) ( \
    ((type*) array.data)[index])

#define _lang_array__Array_set(array, index, type, value) \
    (((type*) array.data)[index] = value)

typedef struct {
    size_t length;
    void* data;
} _lang_array__Array;

#endif // ___lang_array___
```

and got exactly the same problem:
I have no idea how it can be 11 bytes after 9 alloc'd, though. I wanted space for 6 floats, and I addressed (4*5 = 20 = 11+9) bytes in the set and get calls. How does 6 floats, or 5, make 9 bytes?
OK, fixed it:

```c
#define _lang_array__Array_new(type, size) ((_lang_array__Array) { size, array_malloc((size) * sizeof(type)) });
```

There need to be parentheses around `size` in the multiplication. I'll be very wary of unparenthesized macro arguments from now on.
Hahaha that is a good one, thanks for the catch! Should have written my macros more carefully in the first place :)
Wondering if the same bug could exist for 'index'? Maybe if comma-separated expressions existed, like in C.. hmm, don't we use those for generics? But in that case they're already parenthesized..
The fix was also available in #879, but I guess doing it yourself was easier than rebasing and stuff. Glad you're back, if ever so briefly.
@davidhesselbom I sometimes cherry-pick commits, but often I have slightly differing stylistic opinions on fixes (which I realize is frustrating since most of the codebase isn't up to those standards, yet I'm asking that of patches..) — so when it's just a few lines, I just do it by hand. Always try to give credit for it in the commit message though. As for #879, I just completely missed it :(
Right, yeah, the SDK is stylistically inconsistent. When you see things like parameters named 'wtf', it's hard to believe stylistic details matter that much.
That's pretty insulting. Just git blamed that line, it's mine alright. Did you just grep the code for 'wtf', or do you have a particular bone to pick with that line? Just looked around some, and here's what I found:
```c
# define GC_REGISTER_FINALIZER(p, f, d, of, od) \
        GC_debug_register_finalizer(p, f, d, of, od)
# define GC_REGISTER_FINALIZER_IGNORE_SELF(p, f, d, of, od) \
        GC_debug_register_finalizer_ignore_self(p, f, d, of, od)
# define GC_REGISTER_FINALIZER_NO_ORDER(p, f, d, of, od) \
        GC_debug_register_finalizer_no_order(p, f, d, of, od)
```
So no, I don't know what it does. I could hide it by removing the 'wtf' in the argument names, or I could leave that as a mark that if someone finds this declaration and knows more than me, well, I'm curious. But hey, believe what you want.

I could waste time feeling guilty over "not taking care of ooc enough", or I could go on with my day being productive. Even though it would be nice to spend more maintainer time on ooc (rock, really), I'm proud of it. It's existed since 2009; it survived me finishing university, changing jobs three times, and burning out twice. Languages in particular usually have a very short life span, just look at all our original "competition" that is now completely inactive. I'm not saying rock is the best-managed OSS project ever, but it's certainly not the worst.
It wasn't meant to be that insulting, so, I apologize. I know you have other things to do outside of maintaining ooc; it just doesn't seem like ooc is taken very seriously, so it's hard to believe that you care that much about stylistic details in pull requests.

And, yes, I just looked at the official header, because I was wondering whether the argument names were any clearer there. Naming arguments the way it's done in the header is borderline criminal, IMO, and the fact that so many C and C++ libraries out there are just as bad or even worse in that aspect (things like `p, f, d, of, od`) doesn't make it any better.

By the way, we're working on a tool that does some stylistic analysis of ooc code and outputs warnings (mostly whitespace related stuff, but helpful nonetheless). It might not agree with your stylistic opinions, though, and since the SDK is inconsistent, it's hard to know what the "gold standard" is.

Again, I apologize.
@fasterthanlime keep up the good work ;-)
No need — the fact that reading such comments puts me in a bad mood and kills my motivation to spend more time fixing ooc bugs is my problem, not yours. It's never nice to read that, but in an ideal world, I should be able to keep working no matter what anyone says about my work.
Well, yes and no. Once upon a time I was really chasing people's approval on ooc, trying to please everyone, trying to "belong" and be "among the good ones", but nowadays I'm happier if people don't take ooc too seriously. It's precisely because I'm aware of ooc's flaws (limitations that appear random, compiler crashes on what should be a simple error, array nightmares, etc.) that I don't recommend it as easily as I did in the past. I grew up, I became better aware of what solid software is, and rock (despite its name) isn't.

I don't see a contradiction between me not having enough time to bring rock up to my standards, yet spending a few minutes polishing incoming pull requests so at least new code makes me happy (by standards that I haven't written anywhere, see point A). If anything, naming parameters "wtf" is very idiomatic ooc. If you don't understand something, be loud about it. Just read all the code @Shamans wrote for rock: some of it isn't solid at all, and when it isn't, there are comments that say so. Then again, some features he wrote aren't publicized in the docs for that very reason. (Same goes for me, see cover templates for example.)

But I do have an issue with the term "taking seriously", because it's so vague. During my open source "career", I've met very serious people, and in my experience, it doesn't always end well. Sometimes it takes months to undo the damage. Over-design, lots of YAGNI, a general tendency to abstract everything and anything. Being serious about something can have different meanings.

To me, the fact that I'm actively using (and occasionally maintaining) ooc after 6 years is (again) a personal victory. (Note: I'm writing a lot of ooc code for my game Jaakan that isn't open-source — yet.) There's been hard bugs to solve, hard contributors to deal with, the temptation to give it all up and use newfangled language X or Y instead, but I stuck to it — even though my day job no longer involves ooc.
That said, I definitely understand the frustration. My day job does involve working on a niche programming language someone with high standards and little time made as a hobby, and it's not always easy getting into the mind of the original designer. But to say that I don't care is really not knowing me. I can't blame you since we haven't met, but a more accurate assessment of the situation is that I've started to acknowledge that some things are beyond my control. Sometimes, to ship, we make compromises. That's life. Proprietary software sells for a lot of money, and its codebases are often much worse than open-source projects'. Does that mean it's not taken seriously, or that its authors don't care?
Definitely agreed. But C/C++/other clusterfucks have this advantage that they have a solid community around them that somehow manage to cobble together great things (the complement to the Lisp curse, if you will).
I'm indeed interested, and it sounds like a good candidate to run on the SDK & rock's codebase. Let me know how that goes.
you mean to "indirect" :P |
I don't want to take more of your time, so I'll try to be brief, but:
That's not true, and you know it. The more time you spend working on ooc, the less time my colleagues and I have to spend trying to figure out why things don't work as expected. When you get around to it, you get several weeks' worth of issues fixed in a single night. Insulting you, putting you in a bad mood, and having you write long comments to defend yourself from accusations by People Who Are Wrong On The Internet is about the dumbest thing I can do, and I should really know better. It really does make me happy to see you around, because I've taken a liking to the language itself, and I hate the fact that I can't trust its compiler.

I can agree "serious" wasn't the best choice of words. Personally, I think the lighthearted language used in parts of the SDK is uplifting, some days, but when you're having a crappy day, a deadline's coming up, and code that should (and previously did) work suddenly doesn't, that lighthearted language just makes me scared.
Glad to hear it, I'll keep you posted. A first version should be out next week or so.
Well, I get your point (and appreciate the kind words), but really, no, it's my problem. You don't "put me in a bad mood", you say things that I choose to interpret in a certain vexing way, and allow them to make me feel bad. If I can fix that problem, I'll be invincible :)
Yes, that's why I don't want to rely on ooc for anything crucial. As the maintainer-dude, I know how to work around most bugs I see, and when I really can't, or have a bit of time, I fix them. #880 and #881 represent about 5 hours lost trying to debug GLSL shaders, projection matrix code, etc., so believe me, I know how annoying it is :( Then again, it's not like I can just put the world on pause for a week and fix ALL the issues.

Some bugs act like dams: they keep users away from areas that are really unstable, because the bug occurs so soon that they're discouraged from going any further. But fix that bug, and it opens up so many new possibilities. "Now that we can do this, how about that?" And you end up with even more features to support, and even unhappier users, because since it "almost works now" they expect you to get the rest done (even if that means refactoring a third of the compiler because initial support was a hack in the first place), but your amount of free time has remained the same. How do you think we ended up with full generics support? It was supposed to be a very simple thing just so we could get ArrayLists of anything, and look at them now!

Don't feel bad for making me write long-winded answers, that's, again, a trait of mine. Hopefully they're not quite Linus-levels of inflammatory... It actually kind of feels good to write a little about my personal experience with ooc. Since OSCON, I haven't really had a chance to do that, so, it's cool. 👍
There is always someone who cares. |
Check this out:
When compiled without the GC, checking the resulting binary with valgrind, it reports no errors. However, if you leave out the parentheses (i.e. switch line 2 for line 1), you get read and write errors. The array `length` value will stay the same, but you only malloc enough space for 2.25 Floats, not 6, if you leave out the parentheses. You can access the last element of the array without getting out-of-bounds exceptions, but there's a risk you won't find what you put there, because that memory is being used by someone else.
I looked at the C code, and it looks fine to me:

```c
rtest__arr = _lang_array__Array_new(lang_Numbers__Float, 5 + 1);
```
The parentheses around `5 + 1` are there if they're there in the ooc code, otherwise not. But why in the world would the parentheses matter?