Amazing, yes, you are completely right; that was the reason it was always failing. In GCC, I added
LOCAL_CFLAGS += -fsigned-char
to the makefile and it works perfectly and decodes like a charm! As I read, the signedness of a plain char is not defined by the C standard and depends on the architecture, so this error makes perfect sense. But maybe there is a way to explicitly declare it as a signed char directly in the code, so that these things don't happen to others?
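In case it helps others who hit the same issue, here is a minimal sketch of what such an explicit declaration might look like (the buffer and function names are made up for illustration); using signed char, or int8_t from <stdint.h>, fixes the signedness in the code itself instead of relying on a compiler flag:

#include <stdint.h>
#include <stdio.h>

/* Hypothetical decode step: plain 'char' has implementation-defined
   signedness, so arithmetic on raw sample bytes can differ between
   targets (e.g. ARM toolchains often default to unsigned char). */
static int decode_sample(int8_t raw)  /* int8_t is guaranteed signed */
{
    return (int)raw * 2;  /* negative samples stay negative */
}

int main(void)
{
    int8_t samples[] = { -3, 5, -120 };  /* explicit signed storage */
    for (size_t i = 0; i < sizeof samples / sizeof samples[0]; i++)
        printf("%d\n", decode_sample(samples[i]));
    return 0;
}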
Thanks again!