
D - enum/uint->int problem

reply "Lars Ivar Igesund" <larsivi stud.ntnu.no> writes:
Hi!

I'm converting parts of a header file (gl.h) to D.
For the defines I'm using enums, according to the
header->D conversion page.

One of the enums looks like this (shortened here
for space):

enum {
  GL_CLIENT_PIXEL_STORE_BIT = 0x00000001,
  GL_CLIENT_VERTEX_ARRAY_BIT = 0x00000002,
  GL_ALL_CLIENT_ATTRIB_BITS = 0xFFFFFFFF,
  GL_CLIENT_ALL_ATTRIB_BITS = 0xFFFFFFFF
}

This results in the errors
gl.d(691): cannot implicitly convert uint to int
gl.d(692): cannot implicitly convert uint to int

where the last two items are lines 691 and 692.

I understand the error in itself. I guess it is because
0xFFFFFFFF is too large to be an int? I suppose I can
use const uints instead if there is no remedy.
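
For reference: int in D is a 32-bit signed type whose maximum value is
0x7FFFFFFF (2147483647), while 0xFFFFFFFF is 4294967295, so that literal
only fits in a uint and the implicit conversion to int fails. The
const-uint fallback mentioned above would look something like this (an
illustrative sketch, not part of the original post):

  // Plain module-level constants instead of enum members; each
  // declaration has type uint, so no narrowing to int is attempted.
  const uint GL_ALL_CLIENT_ATTRIB_BITS = 0xFFFFFFFF;
  const uint GL_CLIENT_ALL_ATTRIB_BITS = 0xFFFFFFFF;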

Lars Ivar Igesund
Sep 17 2003
parent "Carlos Santander B." <carlos8294 msn.com> writes:
"Lars Ivar Igesund" <larsivi stud.ntnu.no> wrote in message
news:bk9bhi$1jll$1 digitaldaemon.com...
| ...
| enum {
|   GL_CLIENT_PIXEL_STORE_BIT = 0x00000001,
|   GL_CLIENT_VERTEX_ARRAY_BIT = 0x00000002,
|   GL_ALL_CLIENT_ATTRIB_BITS = 0xFFFFFFFF,
|   GL_CLIENT_ALL_ATTRIB_BITS = 0xFFFFFFFF
| }
| ...
|
| Lars Ivar Igesund
|

Try this:

enum : uint {
  GL_CLIENT_PIXEL_STORE_BIT = 0x00000001,
  GL_CLIENT_VERTEX_ARRAY_BIT = 0x00000002,
  GL_ALL_CLIENT_ATTRIB_BITS = 0xFFFFFFFF,
  GL_CLIENT_ALL_ATTRIB_BITS = 0xFFFFFFFF
}
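
With the explicit ": uint" base type every member of the anonymous enum
has type uint, so the 0xFFFFFFFF initializers no longer have to narrow
to int. A minimal check might look like this (an illustrative sketch,
not part of the original reply):

  import std.stdio;

  enum : uint {
    GL_ALL_CLIENT_ATTRIB_BITS = 0xFFFFFFFF
  }

  void main()
  {
      // The member already has type uint, so this assignment
      // involves no conversion at all.
      uint bits = GL_ALL_CLIENT_ATTRIB_BITS;
      writefln("%s", bits);  // prints 4294967295
  }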

-------------------------
Carlos Santander


Sep 17 2003