Convert a UTF16 character to a UTF32 character.
Source position: character.pas line 104
public class function TCharacter.ConvertToUtf32(
  const AString: UnicodeString;
  AIndex: Integer
):UCS4Char; overload;

class function TCharacter.ConvertToUtf32(
  const AString: UnicodeString;
  AIndex: Integer;
  out ACharLength: Integer
):UCS4Char; overload;

class function TCharacter.ConvertToUtf32(
  const AHighSurrogate: UnicodeChar;
  const ALowSurrogate: UnicodeChar
):UCS4Char; overload;
AString
  String containing the character to encode.
AIndex
  Index at which the character to encode is located.
ACharLength
  Character length (in UTF16 characters).
AHighSurrogate
  The high surrogate of the pair.
ALowSurrogate
  The low surrogate of the pair.

Function result:
  The Unicode character in UTF32 format.
TCharacter.ConvertToUtf32 converts a UTF16-encoded Unicode character to a UTF32 (UCS4) character. This is the opposite of TCharacter.ConvertFromUtf32. The function exists in several overloaded versions, so the character to convert can be presented in one of 2 ways: as a position (AIndex) in a string (AString), or as a pair of surrogate characters (AHighSurrogate, ALowSurrogate).
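
For illustration, a minimal sketch of the three overloads (the program name ConvertDemo and the sample code point U+1F600 are made up for this example; it assumes the Character unit and a 1-based AIndex):

program ConvertDemo;

{$mode objfpc}{$H+}

uses
  Character;

var
  S: UnicodeString;
  C: UCS4Char;
  Len: Integer;

begin
  { U+1F600 encoded in UTF16 as the surrogate pair $D83D $DE00 }
  S := #$D83D#$DE00;

  { Overload 1: character located by its index in a string (assumed 1-based) }
  C := TCharacter.ConvertToUtf32(S, 1);
  WriteLn('Code point: ', C);        { 128512 = $1F600 }

  { Overload 2: additionally reports the character length in UTF16 units }
  C := TCharacter.ConvertToUtf32(S, 1, Len);
  WriteLn('UTF16 length: ', Len);    { 2 for a surrogate pair }

  { Overload 3: the surrogate pair passed directly }
  C := TCharacter.ConvertToUtf32(S[1], S[2]);
  WriteLn('Code point: ', C);
end.

The string-based overloads are convenient when scanning a UnicodeString; the surrogate-pair overload avoids building a string when the two UTF16 code units are already at hand.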
If AIndex is not a valid character index in the string AString, an EArgumentOutOfRangeException exception is raised. If the character at that position is not complete, an EArgumentException exception is raised.
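
A similarly hedged sketch of this error behaviour (the program name and the exact exception messages are illustrative only):

program ConvertErrorDemo;

{$mode objfpc}{$H+}

uses
  SysUtils, Character;

var
  S: UnicodeString;

begin
  S := #$D83D;  { a lone high surrogate: an incomplete UTF16 character }

  try
    TCharacter.ConvertToUtf32(S, 5);  { no character at index 5 }
  except
    on E: EArgumentOutOfRangeException do
      WriteLn('Out of range: ', E.Message);
  end;

  try
    TCharacter.ConvertToUtf32(S, 1);  { high surrogate without a low one }
  except
    on E: EArgumentException do
      WriteLn('Incomplete character: ', E.Message);
  end;
end.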
See also:

TCharacter.ConvertFromUtf32
  Convert a UTF32 character to a UnicodeString.
EArgumentOutOfRangeException
  Argument out of valid range passed to a function.
EArgumentException
  Invalid argument passed to a function.