Add (optional) -w option to support 16-bit word output #5
base: master
Conversation
Add file_count to fix size in words
Thanks for the contribution! Sorry about how much of a mess this code base is... I'll clean it up at some point.
I like the optional array name feature, that is very nice.
I have some code cleanup comments for you (I'm happy to do this myself if you don't feel like it). It would also be nice to extend the test to cover the optional array name and the -w flag (maybe I'll play around with Travis for some CI action here).
If you're in the mood for doing it: I don't like the K&R-style variable declarations at the start of the function, so if you can, declare variables at first use (at least the new ones you introduced). You may need to add -std=c99 in the Makefile if your compiler complains about that, but I hope it'll clean up the code a lot. Maybe extract a function here or there if you want to (the bit that does the underscoring seems a good candidate).
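The extraction suggested above could look something like this minimal sketch. The name `sanitize_identifier` is my invention, not from the patch; it just pulls the underscore-replacement loop out into its own function:

```c
#include <assert.h>
#include <ctype.h>
#include <string.h>

/* Hypothetical helper (name not from the patch) extracted from the
   underscore-replacement loop: rewrites name in place so every
   non-alphanumeric character becomes '_'. */
static void sanitize_identifier(char *name)
{
    for (char *p = name; *p != '\0'; ++p) {
        if (!isalnum((unsigned char)*p))
            *p = '_';
    }
}
```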
bin2c.c
Outdated
ident = argv[3];
// Check if array_name passed on command line
if (arg < argc) {
strcpy(array_name, argv[argc]);
Should probably use strncpy here.
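A sketch of the strncpy suggestion (the wrapper name `copy_name` is my own, not from the patch). Note that strncpy does not always write a terminating '\0', so termination has to be forced:

```c
#include <assert.h>
#include <string.h>

/* Bound the copy to the destination size and force termination,
   since strncpy does not null-terminate on truncation. */
static void copy_name(char *dst, size_t dstsize, const char *src)
{
    strncpy(dst, src, dstsize - 1);
    dst[dstsize - 1] = '\0';
}
```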
Alternatively, could malloc the correct size (strdup comes to mind). Just don't forget to free it at the end :D
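The malloc alternative might look like this sketch. strdup is POSIX rather than standard C, so a portable equivalent (hypothetical name `dup_string`) is shown here; the caller must free the result:

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Portable stand-in for POSIX strdup: allocate exactly enough
   space for the string and copy it, returning NULL on failure. */
static char *dup_string(const char *s)
{
    size_t n = strlen(s) + 1;
    char *copy = malloc(n);
    if (copy != NULL)
        memcpy(copy, s, n);
    return copy;
}
```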
strcpy(array_name, filename);
// Replace non-alphanumeric chars with underscore
for (i = 0; (ch = array_name[i]) != '\0'; ++i) {
if (!isalpha(ch) && !isdigit(ch)) {
Are you going to handle the case where the first character is a number?
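One possible way to handle that case is sketched below (the function name `fix_leading_digit` is mine, not from the patch): prefix an underscore when the name starts with a digit, provided the buffer has room:

```c
#include <assert.h>
#include <ctype.h>
#include <string.h>

/* If the identifier starts with a digit, shift it right by one
   and prepend '_' so it becomes a valid C identifier. */
static void fix_leading_digit(char *name, size_t bufsize)
{
    if (isdigit((unsigned char)name[0]) && strlen(name) + 1 < bufsize) {
        memmove(name + 1, name, strlen(name) + 1);
        name[0] = '_';
    }
}
```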
fprintf(f_output, "const char %s[%i] = {", ident, file_size);
for (i = 0; i < file_size; ++i) {
fprintf(f_output, "const unsigned short %s[%i] = {", array_name, file_count);
Please keep consistent indentation :)
unsigned int i, file_size, need_comma;
unsigned char *buf;
char array_name[80];
unsigned int i, file_size, file_count, need_comma;
How about array_size instead of file_count?
bin2c.c
Outdated
fprintf(f_output, "const char %s[%i] = {", ident, file_size);
for (i = 0; i < file_size; ++i) {
if (incr > 1) {
fprintf(f_output, "const unsigned short %s[%i] = {", ident, file_size);
In theory, yes, and in most implementations it really is done that way.
short is 16 bits AT LEAST(!) by the ANSI C standard, but you cannot guarantee it; the exact width is implementation-dependent.
(Only the minimum/maximum ranges have to be assured, and the developers might have decided to exceed them for some unknown reason.)
If you want to be absolutely sure, better use "uint16_t" instead, which is guaranteed to really be 16 bits in size.
(OK, in most cases this is a typedef for unsigned short, but better safe than sorry.)
Almost forgot: for uint16_t to be recognized, the header <stdint.h> has to be included.
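Putting both comments together, the word-output path could be sketched like this. The helper name `emit_words` is hypothetical; `array_name` and `file_count` follow the names in the diff above:

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Sketch of the fixed-width suggestion: uint16_t (from <stdint.h>)
   is exactly 16 bits wherever it exists, unlike unsigned short. */
static void emit_words(FILE *out, const char *array_name,
                       const uint16_t *data, unsigned file_count)
{
    fprintf(out, "const uint16_t %s[%u] = {", array_name, file_count);
    for (unsigned i = 0; i < file_count; ++i)
        fprintf(out, "%s0x%04x", i ? ", " : "", (unsigned)data[i]);
    fprintf(out, "};\n");
}
```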
I made a minor change to add an (optional) -w flag that supports 16-bit word output in addition to the 8-bit byte output.