D - enum/uint->int problem
Lars Ivar Igesund <larsivi stud.ntnu.no>
Hi!
I'm converting parts of a header file (gl.h) to D.
For the #defines I'm using enums, as described on the
header->D conversion page.
One of the enums looks like this (shortened to save
space):
enum {
GL_CLIENT_PIXEL_STORE_BIT = 0x00000001,
GL_CLIENT_VERTEX_ARRAY_BIT = 0x00000002,
GL_ALL_CLIENT_ATTRIB_BITS = 0xFFFFFFFF,
GL_CLIENT_ALL_ATTRIB_BITS = 0xFFFFFFFF
}
This results in the errors
gl.d(691): cannot implicitly convert uint to int
gl.d(692): cannot implicitly convert uint to int
where the last two members are on lines 691 and 692.
I understand the error itself; I guess it is because
0xFFFFFFFF is too large to fit in an int? I suppose I can
use const uints instead if there is no remedy.
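For reference, the const uint workaround mentioned here might look
roughly like this (a sketch using the same gl.h names, not code from
the original post):

// Module-level uint constants: 0xFFFFFFFF fits in a uint,
// so no uint-to-int conversion is involved.
const uint GL_CLIENT_PIXEL_STORE_BIT  = 0x00000001;
const uint GL_CLIENT_VERTEX_ARRAY_BIT = 0x00000002;
const uint GL_ALL_CLIENT_ATTRIB_BITS  = 0xFFFFFFFF;
const uint GL_CLIENT_ALL_ATTRIB_BITS  = 0xFFFFFFFF;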
Lars Ivar Igesund
Sep 17 2003
"Lars Ivar Igesund" <larsivi stud.ntnu.no> wrote in message
news:bk9bhi$1jll$1 digitaldaemon.com...
| ...
| enum {
| GL_CLIENT_PIXEL_STORE_BIT = 0x00000001,
| GL_CLIENT_VERTEX_ARRAY_BIT = 0x00000002,
| GL_ALL_CLIENT_ATTRIB_BITS = 0xFFFFFFFF,
| GL_CLIENT_ALL_ATTRIB_BITS = 0xFFFFFFFF
| }
| ...
|
| Lars Ivar Igesund
|
Try this:
enum : uint {
GL_CLIENT_PIXEL_STORE_BIT = 0x00000001,
GL_CLIENT_VERTEX_ARRAY_BIT = 0x00000002,
GL_ALL_CLIENT_ATTRIB_BITS = 0xFFFFFFFF,
GL_CLIENT_ALL_ATTRIB_BITS = 0xFFFFFFFF
}
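A minimal, self-contained check (not part of the original post) that the
": uint" base type accepts 0xFFFFFFFF and that the constants then behave
as ordinary uint bit masks:

enum : uint {
    GL_CLIENT_PIXEL_STORE_BIT = 0x00000001,
    GL_ALL_CLIENT_ATTRIB_BITS = 0xFFFFFFFF
}

void main()
{
    // With a uint base type, the 0xFFFFFFFF initializer needs no
    // narrowing to int, so the enum compiles.
    static assert(GL_ALL_CLIENT_ATTRIB_BITS == uint.max);

    // The members are plain uint values and can be used as bit masks.
    uint attribs = GL_ALL_CLIENT_ATTRIB_BITS;
    assert((attribs & GL_CLIENT_PIXEL_STORE_BIT) != 0);
}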
—————————————————————————
Carlos Santander
Sep 17 2003