oath_base32_decode(3) | liboath | oath_base32_decode(3) |
NAME
oath_base32_decode - API function
SYNOPSIS
#include <oath.h>
int oath_base32_decode(const char * in, size_t inlen, char ** out, size_t * outlen);
ARGUMENTS
- const char * in: input string with base32 encoded data of length inlen
- size_t inlen: length of the input base32 string in
- char ** out: pointer to output variable for binary data of length outlen, or NULL
- size_t * outlen: pointer to output variable holding the length of out, or NULL
DESCRIPTION
Decode a base32 encoded string into binary data.
Space characters are ignored and pad characters are added if needed. Non-base32 characters are not ignored; they cause the function to return an OATH_INVALID_BASE32 error.
The in parameter should contain inlen bytes of base32 encoded data. The function allocates a new string in *out to hold the decoded data, and sets *outlen to the length of the data.
If out is NULL, then *outlen will be set to what would have been the length of *out on successful decoding.
If the caller is not interested in knowing the length of the output data out, then outlen may be set to NULL.
It is permitted but useless to have both out and outlen NULL.
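A minimal usage sketch follows, based on the calling convention described above. The base32 test string, the oath_init()/oath_done() setup calls, and releasing the allocated buffer with free() are assumptions drawn from liboath's general conventions rather than anything stated on this page.

    /* Sketch: decode a base32 string into a newly allocated buffer. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <oath.h>

    int
    main (void)
    {
      const char *b32 = "GEZDGNBVGY3TQOJQ";  /* base32 encoding of "1234567890" */
      char *bin = NULL;
      size_t binlen = 0;
      int rc;

      oath_init ();   /* usual liboath setup; see oath_init(3) */

      rc = oath_base32_decode (b32, strlen (b32), &bin, &binlen);
      if (rc != OATH_OK)
        {
          fprintf (stderr, "oath_base32_decode failed: %d\n", rc);
          oath_done ();
          return EXIT_FAILURE;
        }

      printf ("decoded %zu bytes: %.*s\n", binlen, (int) binlen, bin);

      free (bin);     /* *out is newly allocated; assumed to be released with free() */
      oath_done ();
      return EXIT_SUCCESS;
    }

Passing NULL for out, as described above, performs the same validation and reports only the decoded length in *outlen.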
RETURNS
On success, OATH_OK (zero) is returned. OATH_INVALID_BASE32 is returned if the input contains non-base32 characters, and OATH_MALLOC_ERROR is returned on memory allocation errors.
SINCE
1.12.0
REPORTING BUGS
Report bugs to <oath-toolkit-help@nongnu.org>.
liboath home page: https://www.nongnu.org/oath-toolkit/
General help using GNU software: http://www.gnu.org/gethelp/
COPYRIGHT
Copyright © 2009-2020 Simon Josefsson.
Copying and distribution of this file, with or without modification, are
permitted in any medium without royalty provided the copyright notice and
this notice are preserved.