
digitalmars.D.learn - Unicode arithmetic at run-time

reply "Charles McAnany" <dlang charlesmcanany.com> writes:
Friends,
I note that there are playing cards in unicode:
http://en.wikipedia.org/wiki/Playing_cards_in_Unicode

They follow a nice pattern, so I can quickly convert from a rank 
and suit to the appropriate escape sequence in D. I'd like to 
automate this, but I can't seem to do arithmetic on unicode 
characters as I could in ascii.

writefln("%c", '/U0001F0A1'); //works fine, ace of spades. 
Backslash replaced with / for displayability.
//So logically, this would be the two of spades:
writefln("%c", '/U0001F0A1'+1); // 
std.format.FormatException format.d(1325): integral

Would this be solvable with a mixin (return "//U00001F0A" ~ 
rank;), or are escape sequences impossible to generate after the 
source has been lexed by dmd?

I looked in std.uni and std.utf, but they don't seem to want to 
generate a unicode character from an int, they're more concerned 
about switching between encodings.

Cheers,
Charles.
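
For reference, a minimal sketch of the rank-and-suit arithmetic being asked 
about. The block layout (aces at U+1F0A1, U+1F0B1, U+1F0C1, U+1F0D1, with 
ranks consecutive within each suit) is taken from the linked Wikipedia page; 
the helper name and the zero-based suit/rank indexes are made up here, and 
the cast is what the replies below explain:

import std.stdio;

// Hypothetical helper: suit 0..3 (spades, hearts, diamonds, clubs) and
// rank 0 for ace, 1 for two, and so on, per the layout on the Wikipedia
// page. Each suit block starts 0x10 code points after the previous one.
dchar card(uint suit, uint rank)
{
    // Arithmetic on a dchar yields an integer, so convert back explicitly.
    return cast(dchar)('\U0001F0A1' + suit * 0x10 + rank);
}

void main()
{
    writefln("%c", card(0, 0)); // ace of spades, U+1F0A1
    writefln("%c", card(0, 1)); // two of spades, U+1F0A2
    writefln("%c", card(1, 0)); // ace of hearts, U+1F0B1
}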
Sep 20 2014
next sibling parent reply "Adam D. Ruppe" <destructionator gmail.com> writes:
On Sunday, 21 September 2014 at 03:00:34 UTC, Charles McAnany 
wrote:
 writefln("%c", '/U0001F0A1'+1); //
The problem here is just that arithmetic converts everything back to 
integers and writefln is a bit picky about types. You can print it 
though by casting it back to dchar:

writefln("%c", cast(dchar)('\U0001F0A1'+1));

My fonts don't support these chars but it should print out if you do that.
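
For illustration, a self-contained version of that suggestion (what actually 
shows up on screen depends on font support, as noted):

import std.stdio;

void main()
{
    // '\U0001F0A1' is the ace of spades; adding 1 yields an integer,
    // so cast the result back to dchar before formatting it with %c.
    writefln("%c", cast(dchar)('\U0001F0A1' + 1)); // two of spades
}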
Sep 20 2014
parent reply "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Sunday, 21 September 2014 at 03:13:19 UTC, Adam D. Ruppe wrote:
         writefln("%c", cast(dchar)('\U0001F0A1'+1));
A bit less ugly:

writefln("%c", dchar('\U0001F0A1'+1));
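
The same thing as a self-contained sketch, using the constructor-style 
conversion (which, as the next reply notes, needs a DMD 2.066 front end):

import std.stdio;

void main()
{
    // dchar(...) converts the integer result back to a dchar, just like
    // the cast in the previous reply.
    writefln("%c", dchar('\U0001F0A1' + 1)); // two of spades
}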
Sep 21 2014
parent ketmar via Digitalmars-d-learn <digitalmars-d-learn puremagic.com> writes:
On Mon, 22 Sep 2014 01:02:32 +0000
Vladimir Panteleev via Digitalmars-d-learn
<digitalmars-d-learn puremagic.com> wrote:

 A bit less ugly:
 writefln("%c", dchar('\U0001F0A1'+1));
this won't work in gdc, though: 2.066 hasn't landed there yet.
Sep 21 2014
prev sibling next sibling parent reply ketmar via Digitalmars-d-learn <digitalmars-d-learn puremagic.com> writes:
On Sun, 21 Sep 2014 03:00:32 +0000
Charles McAnany via Digitalmars-d-learn
<digitalmars-d-learn puremagic.com> wrote:

can't this help:

  writefln("%c", cast(dchar)('\U0001F0A1'+1));

arithmetic with dchars is automatically coerced to int, so you must cast
the result back to dchar.
Sep 20 2014
parent reply Ali Çehreli <acehreli yahoo.com> writes:
On 09/20/2014 08:14 PM, ketmar via Digitalmars-d-learn wrote:

 arithmetic with dchars is automatically coerced to int
My unimportant contribution to this thread: It is actually uint in this 
case. :)

Ali
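
For illustration, the promotion can be checked at compile time:

// dchar promotes to uint, and uint + int yields uint.
static assert(is(typeof('\U0001F0A1' + 1) == uint));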
Sep 20 2014
parent ketmar via Digitalmars-d-learn <digitalmars-d-learn puremagic.com> writes:
On Sat, 20 Sep 2014 22:12:52 -0700
Ali Çehreli via Digitalmars-d-learn <digitalmars-d-learn puremagic.com>
wrote:

 My unimportant contribution to this thread: It is actually uint in
 this case. :)
ah, sure. i just lost that 'u' somewhere... gotta find it. ;-)
Sep 20 2014
prev sibling parent "kiran kumari" <kiranfabzen gmail.com> writes:
On Sunday, 21 September 2014 at 03:00:34 UTC, Charles McAnany
wrote:
 Friends,
 I note that there are playing cards in unicode:
 http://en.wikipedia.org/wiki/Playing_cards_in_Unicode

 They follow a nice pattern, so I can quickly convert from a 
 rank and suit to the appropriate escape sequence in D. I'd like 
 to automate this, but I can't seem to do arithmetic on unicode 
 characters as I could in ascii.

 writefln("%c", '/U0001F0A1'); //works fine, ace of spades. 
 Backslash replaced with / for displayability.
 //So logically, this would be the two of spades:
 writefln("%c", '/U0001F0A1'+1); // 
 std.format.FormatException format.d(1325): integral

 Would this be solvable with a mixin (return "//U00001F0A" ~ 
 rank;), or are escape sequences impossible to generate after 
 the source has been lexed by dmd?

 I looked in std.uni and std.utf, but they don't seem to want to 
 generate a unicode character from an int, they're more 
 concerned about switching between encodings.

 Cheers,
 Charles.
see more example http://techgurulab.com/course/java-quiz-online/
Sep 24 2014